00:00:00.001 Started by upstream project "autotest-nightly-lts" build number 2002
00:00:00.001 originally caused by:
00:00:00.002 Started by upstream project "nightly-trigger" build number 3263
00:00:00.002 originally caused by:
00:00:00.002 Started by timer
00:00:00.020 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.022 The recommended git tool is: git
00:00:00.022 using credential 00000000-0000-0000-0000-000000000002
00:00:00.025 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.040 Fetching changes from the remote Git repository
00:00:00.045 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.059 Using shallow fetch with depth 1
00:00:00.059 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.059 > git --version # timeout=10
00:00:00.086 > git --version # 'git version 2.39.2'
00:00:00.086 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.112 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.112 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.260 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.271 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.281 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD)
00:00:02.281 > git config core.sparsecheckout # timeout=10
00:00:02.292 > git read-tree -mu HEAD # timeout=10
00:00:02.308 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5
00:00:02.326 Commit message: "inventory: add WCP3 to free inventory"
00:00:02.326 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10
00:00:02.524 [Pipeline] Start of Pipeline
00:00:02.537 [Pipeline] library
00:00:02.538 Loading library shm_lib@master
00:00:02.538 Library shm_lib@master is cached. Copying from home.
00:00:02.551 [Pipeline] node
00:00:02.568 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:02.569 [Pipeline] {
00:00:02.578 [Pipeline] catchError
00:00:02.579 [Pipeline] {
00:00:02.588 [Pipeline] wrap
00:00:02.595 [Pipeline] {
00:00:02.601 [Pipeline] stage
00:00:02.602 [Pipeline] { (Prologue)
00:00:02.777 [Pipeline] sh
00:00:03.057 + logger -p user.info -t JENKINS-CI
00:00:03.076 [Pipeline] echo
00:00:03.078 Node: WFP20
00:00:03.085 [Pipeline] sh
00:00:03.375 [Pipeline] setCustomBuildProperty
00:00:03.384 [Pipeline] echo
00:00:03.385 Cleanup processes
00:00:03.389 [Pipeline] sh
00:00:03.662 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.662 3463062 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.675 [Pipeline] sh
00:00:03.991 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.991 ++ grep -v 'sudo pgrep'
00:00:03.991 ++ awk '{print $1}'
00:00:03.991 + sudo kill -9
00:00:03.991 + true
00:00:04.002 [Pipeline] cleanWs
00:00:04.009 [WS-CLEANUP] Deleting project workspace...
00:00:04.009 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.014 [WS-CLEANUP] done
00:00:04.017 [Pipeline] setCustomBuildProperty
00:00:04.073 [Pipeline] sh
00:00:04.348 + sudo git config --global --replace-all safe.directory '*'
00:00:04.426 [Pipeline] httpRequest
00:00:04.440 [Pipeline] echo
00:00:04.442 Sorcerer 10.211.164.101 is alive
00:00:04.448 [Pipeline] httpRequest
00:00:04.452 HttpMethod: GET
00:00:04.452 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.453 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.454 Response Code: HTTP/1.1 200 OK
00:00:04.455 Success: Status code 200 is in the accepted range: 200,404
00:00:04.455 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.884 [Pipeline] sh
00:00:05.161 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.173 [Pipeline] httpRequest
00:00:05.200 [Pipeline] echo
00:00:05.201 Sorcerer 10.211.164.101 is alive
00:00:05.208 [Pipeline] httpRequest
00:00:05.212 HttpMethod: GET
00:00:05.212 URL: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz
00:00:05.213 Sending request to url: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz
00:00:05.220 Response Code: HTTP/1.1 200 OK
00:00:05.221 Success: Status code 200 is in the accepted range: 200,404
00:00:05.221 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz
00:01:12.130 [Pipeline] sh
00:01:12.420 + tar --no-same-owner -xf spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz
00:01:14.970 [Pipeline] sh
00:01:15.255 + git -C spdk log --oneline -n5
00:01:15.255 4b94202c6 lib/event: Bug fix for framework_set_scheduler
00:01:15.255 507e9ba07 nvme: add lock_depth for ctrlr_lock
00:01:15.255 62fda7b5f nvme: check pthread_mutex_destroy() return value
00:01:15.255 e03c164a1 nvme: add nvme_ctrlr_lock
00:01:15.255 d61f89a86 nvme/cuse: Add ctrlr_lock for cuse register and unregister
00:01:15.267 [Pipeline] }
00:01:15.283 [Pipeline] // stage
00:01:15.292 [Pipeline] stage
00:01:15.294 [Pipeline] { (Prepare)
00:01:15.312 [Pipeline] writeFile
00:01:15.328 [Pipeline] sh
00:01:15.614 + logger -p user.info -t JENKINS-CI
00:01:15.626 [Pipeline] sh
00:01:15.911 + logger -p user.info -t JENKINS-CI
00:01:15.927 [Pipeline] sh
00:01:16.215 + cat autorun-spdk.conf
00:01:16.215 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:16.215 SPDK_TEST_FUZZER_SHORT=1
00:01:16.215 SPDK_TEST_FUZZER=1
00:01:16.215 SPDK_RUN_UBSAN=1
00:01:16.222 RUN_NIGHTLY=1
00:01:16.227 [Pipeline] readFile
00:01:16.259 [Pipeline] withEnv
00:01:16.262 [Pipeline] {
00:01:16.278 [Pipeline] sh
00:01:16.563 + set -ex
00:01:16.563 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:16.563 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:16.563 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:16.563 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:16.563 ++ SPDK_TEST_FUZZER=1
00:01:16.563 ++ SPDK_RUN_UBSAN=1
00:01:16.563 ++ RUN_NIGHTLY=1
00:01:16.563 + case $SPDK_TEST_NVMF_NICS in
00:01:16.563 + DRIVERS=
00:01:16.563 + [[ -n '' ]]
00:01:16.563 + exit 0
00:01:16.573 [Pipeline] }
00:01:16.591 [Pipeline] // withEnv
00:01:16.597 [Pipeline] }
00:01:16.614 [Pipeline] // stage
00:01:16.625 [Pipeline] catchError
00:01:16.627 [Pipeline] {
00:01:16.643 [Pipeline] timeout
00:01:16.643 Timeout set to expire in 30 min
00:01:16.644 [Pipeline] {
00:01:16.661 [Pipeline] stage
00:01:16.663 [Pipeline] { (Tests)
00:01:16.678 [Pipeline] sh
00:01:16.966 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:16.966 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:16.966 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:16.966 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:16.966 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:16.966 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:16.966 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:16.966 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:16.966 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:16.966 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:16.966 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:16.966 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:16.966 + source /etc/os-release
00:01:16.966 ++ NAME='Fedora Linux'
00:01:16.966 ++ VERSION='38 (Cloud Edition)'
00:01:16.966 ++ ID=fedora
00:01:16.966 ++ VERSION_ID=38
00:01:16.966 ++ VERSION_CODENAME=
00:01:16.966 ++ PLATFORM_ID=platform:f38
00:01:16.966 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:16.966 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:16.966 ++ LOGO=fedora-logo-icon
00:01:16.966 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:16.966 ++ HOME_URL=https://fedoraproject.org/
00:01:16.966 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:16.966 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:16.966 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:16.966 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:16.966 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:16.966 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:16.966 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:16.966 ++ SUPPORT_END=2024-05-14
00:01:16.966 ++ VARIANT='Cloud Edition'
00:01:16.966 ++ VARIANT_ID=cloud
00:01:16.966 + uname -a
00:01:16.966 Linux spdk-wfp-20 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:01:16.966 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:19.506 Hugepages
00:01:19.506 node hugesize free / total
00:01:19.766 node0 1048576kB 0 / 0
00:01:19.766 node0 2048kB 0 / 0
00:01:19.766 node1 1048576kB 0 / 0
00:01:19.766 node1 2048kB 0 / 0
00:01:19.766 
00:01:19.766 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:19.766 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:19.766 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:19.766 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:19.766 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:19.766 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:19.766 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:19.766 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:19.766 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:19.766 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:19.766 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:19.766 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:19.766 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:19.766 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:19.766 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:19.766 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:19.766 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:19.766 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:19.766 + rm -f /tmp/spdk-ld-path
00:01:19.766 + source autorun-spdk.conf
00:01:19.766 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:19.766 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:19.766 ++ SPDK_TEST_FUZZER=1
00:01:19.766 ++ SPDK_RUN_UBSAN=1
00:01:19.766 ++ RUN_NIGHTLY=1
00:01:19.766 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:19.766 + [[ -n '' ]]
00:01:19.766 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:19.766 + for M in /var/spdk/build-*-manifest.txt
00:01:19.766 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:19.766 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:19.766 + for M in /var/spdk/build-*-manifest.txt
00:01:19.766 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:19.766 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:19.766 ++ uname
00:01:20.026 + [[ Linux == \L\i\n\u\x ]]
00:01:20.026 + sudo dmesg -T
00:01:20.026 + sudo dmesg --clear
00:01:20.026 + dmesg_pid=3463965
00:01:20.026 + [[ Fedora Linux == FreeBSD ]]
00:01:20.026 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:20.026 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:20.026 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:20.026 + [[ -x /usr/src/fio-static/fio ]]
00:01:20.026 + export FIO_BIN=/usr/src/fio-static/fio
00:01:20.026 + FIO_BIN=/usr/src/fio-static/fio
00:01:20.026 + sudo dmesg -Tw
00:01:20.026 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:20.026 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:20.026 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:20.026 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:20.026 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:20.026 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:20.026 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:20.026 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:20.026 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:20.026 Test configuration:
00:01:20.026 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:20.026 SPDK_TEST_FUZZER_SHORT=1
00:01:20.026 SPDK_TEST_FUZZER=1
00:01:20.026 SPDK_RUN_UBSAN=1
00:01:20.026 RUN_NIGHTLY=1
00:01:20.026 21:23:58 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:01:20.026 21:23:58 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:01:20.026 21:23:58 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:01:20.026 21:23:58 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:01:20.026 21:23:58 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:20.026 21:23:58 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:20.026 21:23:58 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:20.026 21:23:58 -- paths/export.sh@5 -- $ export PATH
00:01:20.026 21:23:58 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:20.026 21:23:58 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:01:20.026 21:23:58 -- common/autobuild_common.sh@435 -- $ date +%s
00:01:20.026 21:23:58 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1720812238.XXXXXX
00:01:20.026 21:23:58 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1720812238.ZiTaZs
00:01:20.026 21:23:58 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:01:20.026 21:23:58 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']'
00:01:20.026 21:23:58 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:01:20.026 21:23:58 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:01:20.026 21:23:58 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:01:20.026 21:23:58 -- common/autobuild_common.sh@451 -- $ get_config_params
00:01:20.026 21:23:58 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:01:20.026 21:23:58 -- common/autotest_common.sh@10 -- $ set +x
00:01:20.026 21:23:58 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
00:01:20.026 21:23:58 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:01:20.026 21:23:58 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:20.026 21:23:58 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:20.026 21:23:58 -- spdk/autobuild.sh@16 -- $ date -u
00:01:20.026 Fri Jul 12 07:23:58 PM UTC 2024
00:01:20.026 21:23:58 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:20.026 LTS-59-g4b94202c6
00:01:20.026 21:23:58 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:20.026 21:23:58 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:20.026 21:23:58 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:20.026 21:23:58 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:01:20.026 21:23:58 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:01:20.026 21:23:58 -- common/autotest_common.sh@10 -- $ set +x
00:01:20.026 ************************************
00:01:20.026 START TEST ubsan
00:01:20.026 ************************************
00:01:20.027 21:23:58 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan'
00:01:20.027 using ubsan
00:01:20.027 
00:01:20.027 real 0m0.000s
00:01:20.027 user 0m0.000s
00:01:20.027 sys 0m0.000s
00:01:20.027 21:23:58 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:20.027 21:23:58 -- common/autotest_common.sh@10 -- $ set +x
00:01:20.027 ************************************
00:01:20.027 END TEST ubsan
00:01:20.027 ************************************
00:01:20.286 21:23:58 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:01:20.286 21:23:58 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:20.286 21:23:58 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:20.286 21:23:58 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:01:20.286 21:23:58 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:01:20.286 21:23:58 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:01:20.286 21:23:58 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
00:01:20.286 21:23:58 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:01:20.286 21:23:58 -- common/autotest_common.sh@10 -- $ set +x
00:01:20.286 ************************************
00:01:20.286 START TEST autobuild_llvm_precompile
00:01:20.286 ************************************
00:01:20.286 21:23:58 -- common/autotest_common.sh@1104 -- $ _llvm_precompile
00:01:20.286 21:23:58 -- common/autobuild_common.sh@32 -- $ clang --version
00:01:20.286 21:23:58 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38)
00:01:20.286 Target: x86_64-redhat-linux-gnu
00:01:20.286 Thread model: posix
00:01:20.286 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:01:20.286 21:23:58 -- common/autobuild_common.sh@33 -- $ clang_num=16
00:01:20.286 21:23:58 -- common/autobuild_common.sh@35 -- $ export CC=clang-16
00:01:20.286 21:23:58 -- common/autobuild_common.sh@35 -- $ CC=clang-16
00:01:20.286 21:23:58 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16
00:01:20.286 21:23:58 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16
00:01:20.286 21:23:58 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a)
00:01:20.286 21:23:58 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:20.286 21:23:58 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]]
00:01:20.286 21:23:58 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a'
00:01:20.286 21:23:58 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:20.546 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:20.546 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:20.806 Using 'verbs' RDMA provider
00:01:36.640 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:01:48.915 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:01:49.174 Creating mk/config.mk...done.
00:01:49.174 Creating mk/cc.flags.mk...done.
00:01:49.174 Type 'make' to build.
00:01:49.174 
00:01:49.174 real 0m29.079s
00:01:49.174 user 0m12.528s
00:01:49.174 sys 0m15.912s
00:01:49.174 21:24:27 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:49.174 21:24:27 -- common/autotest_common.sh@10 -- $ set +x
00:01:49.174 ************************************
00:01:49.174 END TEST autobuild_llvm_precompile
00:01:49.174 ************************************
00:01:49.174 21:24:27 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:49.174 21:24:27 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:49.174 21:24:27 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:49.174 21:24:27 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:01:49.174 21:24:27 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:49.744 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:49.744 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:50.004 Using 'verbs' RDMA provider
00:02:03.169 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:15.451 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:15.451 Creating mk/config.mk...done.
00:02:15.451 Creating mk/cc.flags.mk...done.
00:02:15.451 Type 'make' to build.
00:02:15.451 21:24:52 -- spdk/autobuild.sh@69 -- $ run_test make make -j112
00:02:15.451 21:24:52 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:02:15.451 21:24:52 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:02:15.451 21:24:52 -- common/autotest_common.sh@10 -- $ set +x
00:02:15.451 ************************************
00:02:15.451 START TEST make
00:02:15.451 ************************************
00:02:15.451 21:24:52 -- common/autotest_common.sh@1104 -- $ make -j112
00:02:15.451 make[1]: Nothing to be done for 'all'.
00:02:16.018 The Meson build system
00:02:16.018 Version: 1.3.1
00:02:16.018 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:02:16.018 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:16.018 Build type: native build
00:02:16.018 Project name: libvfio-user
00:02:16.018 Project version: 0.0.1
00:02:16.018 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:02:16.018 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:02:16.018 Host machine cpu family: x86_64
00:02:16.018 Host machine cpu: x86_64
00:02:16.018 Run-time dependency threads found: YES
00:02:16.018 Library dl found: YES
00:02:16.018 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:16.018 Run-time dependency json-c found: YES 0.17
00:02:16.018 Run-time dependency cmocka found: YES 1.1.7
00:02:16.018 Program pytest-3 found: NO
00:02:16.018 Program flake8 found: NO
00:02:16.018 Program misspell-fixer found: NO
00:02:16.018 Program restructuredtext-lint found: NO
00:02:16.018 Program valgrind found: YES (/usr/bin/valgrind)
00:02:16.018 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:16.018 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:16.018 Compiler for C supports arguments -Wwrite-strings: YES
00:02:16.018 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:16.018 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:02:16.018 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:02:16.018 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:16.018 Build targets in project: 8
00:02:16.018 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:02:16.018 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:02:16.018 
00:02:16.018 libvfio-user 0.0.1
00:02:16.018 
00:02:16.018 User defined options
00:02:16.018 buildtype : debug
00:02:16.018 default_library: static
00:02:16.018 libdir : /usr/local/lib
00:02:16.018 
00:02:16.018 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:16.277 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:16.536 [1/36] Compiling C object samples/null.p/null.c.o
00:02:16.536 [2/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:02:16.536 [3/36] Compiling C object samples/lspci.p/lspci.c.o
00:02:16.536 [4/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:02:16.536 [5/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:02:16.536 [6/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:02:16.536 [7/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:02:16.536 [8/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:02:16.536 [9/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:02:16.536 [10/36] Compiling C object test/unit_tests.p/mocks.c.o
00:02:16.536 [11/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:02:16.536 [12/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:02:16.536 [13/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:02:16.536 [14/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:02:16.536 [15/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:02:16.536 [16/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:02:16.536 [17/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:02:16.536 [18/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:02:16.536 [19/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:02:16.536 [20/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:02:16.536 [21/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:02:16.536 [22/36] Compiling C object samples/server.p/server.c.o
00:02:16.536 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:02:16.536 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:02:16.536 [25/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:02:16.536 [26/36] Compiling C object samples/client.p/client.c.o
00:02:16.536 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:02:16.536 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:02:16.536 [29/36] Linking static target lib/libvfio-user.a
00:02:16.537 [30/36] Linking target samples/client
00:02:16.537 [31/36] Linking target samples/null
00:02:16.537 [32/36] Linking target test/unit_tests
00:02:16.537 [33/36] Linking target samples/shadow_ioeventfd_server
00:02:16.537 [34/36] Linking target samples/lspci
00:02:16.537 [35/36] Linking target samples/gpio-pci-idio-16
00:02:16.537 [36/36] Linking target samples/server
00:02:16.537 INFO: autodetecting backend as ninja
00:02:16.537 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:16.796 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:17.055 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:17.055 ninja: no work to do.
00:02:22.333 The Meson build system
00:02:22.333 Version: 1.3.1
00:02:22.333 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
00:02:22.333 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp
00:02:22.333 Build type: native build
00:02:22.333 Program cat found: YES (/usr/bin/cat)
00:02:22.333 Project name: DPDK
00:02:22.333 Project version: 23.11.0
00:02:22.333 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:02:22.333 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:02:22.333 Host machine cpu family: x86_64
00:02:22.333 Host machine cpu: x86_64
00:02:22.333 Message: ## Building in Developer Mode ##
00:02:22.333 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:22.333 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:02:22.333 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:02:22.333 Program python3 found: YES (/usr/bin/python3)
00:02:22.333 Program cat found: YES (/usr/bin/cat)
00:02:22.333 Compiler for C supports arguments -march=native: YES
00:02:22.333 Checking for size of "void *" : 8
00:02:22.333 Checking for size of "void *" : 8 (cached)
00:02:22.333 Library m found: YES
00:02:22.333 Library numa found: YES
00:02:22.333 Has header "numaif.h" : YES
00:02:22.333 Library fdt found: NO
00:02:22.333 Library execinfo found: NO
00:02:22.333 Has header "execinfo.h" : YES
00:02:22.333 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:22.333 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:22.333 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:22.333 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:22.333 Run-time dependency openssl found: YES 3.0.9
00:02:22.333 Run-time dependency libpcap found: YES 1.10.4
00:02:22.333 Has header "pcap.h" with dependency libpcap: YES
00:02:22.333 Compiler for C supports arguments -Wcast-qual: YES
00:02:22.333 Compiler for C supports arguments -Wdeprecated: YES
00:02:22.333 Compiler for C supports arguments -Wformat: YES
00:02:22.333 Compiler for C supports arguments -Wformat-nonliteral: YES
00:02:22.333 Compiler for C supports arguments -Wformat-security: YES
00:02:22.333 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:22.333 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:22.333 Compiler for C supports arguments -Wnested-externs: YES
00:02:22.333 Compiler for C supports arguments -Wold-style-definition: YES
00:02:22.333 Compiler for C supports arguments -Wpointer-arith: YES
00:02:22.333 Compiler for C supports arguments -Wsign-compare: YES
00:02:22.333 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:22.333 Compiler for C supports arguments -Wundef: YES
00:02:22.333 Compiler for C supports arguments -Wwrite-strings: YES
00:02:22.333 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:22.333 Compiler for C supports arguments -Wno-packed-not-aligned: NO
00:02:22.333 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:22.333 Program objdump found: YES (/usr/bin/objdump)
00:02:22.333 Compiler for C supports arguments -mavx512f: YES
00:02:22.333 Checking if "AVX512 checking" compiles: YES
00:02:22.333 Fetching value of define "__SSE4_2__" : 1
00:02:22.333 Fetching value of define "__AES__" : 1
00:02:22.333 Fetching value of define "__AVX__" : 1
00:02:22.333 Fetching value of define "__AVX2__" : 1
00:02:22.334 Fetching value of define "__AVX512BW__" : 1
00:02:22.334 Fetching value of define "__AVX512CD__" : 1
00:02:22.334 Fetching value of define "__AVX512DQ__" : 1
00:02:22.334 Fetching value of define "__AVX512F__" : 1
00:02:22.334 Fetching value of define "__AVX512VL__" : 1
00:02:22.334 Fetching value of define "__PCLMUL__" : 1
00:02:22.334 Fetching value of define "__RDRND__" : 1
00:02:22.334 Fetching value of define "__RDSEED__" : 1
00:02:22.334 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:22.334 Fetching value of define "__znver1__" : (undefined)
00:02:22.334 Fetching value of define "__znver2__" : (undefined)
00:02:22.334 Fetching value of define "__znver3__" : (undefined)
00:02:22.334 Fetching value of define "__znver4__" : (undefined)
00:02:22.334 Compiler for C supports arguments -Wno-format-truncation: NO
00:02:22.334 Message: lib/log: Defining dependency "log"
00:02:22.334 Message: lib/kvargs: Defining dependency "kvargs"
00:02:22.334 Message: lib/telemetry: Defining dependency "telemetry"
00:02:22.334 Checking for function "getentropy" : NO
00:02:22.334 Message: lib/eal: Defining dependency "eal"
00:02:22.334 Message: lib/ring: Defining dependency "ring"
00:02:22.334 Message: lib/rcu: Defining dependency "rcu"
00:02:22.334 Message: lib/mempool: Defining dependency "mempool"
00:02:22.334 Message: lib/mbuf: Defining dependency "mbuf"
00:02:22.334 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:22.334 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:22.334 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:22.334 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:22.334 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:22.334 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:22.334 Compiler for C supports arguments -mpclmul: YES
00:02:22.334 Compiler for C supports arguments -maes: YES
00:02:22.334 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:22.334 Compiler for C supports arguments -mavx512bw: YES
00:02:22.334 Compiler for C supports arguments -mavx512dq: YES
00:02:22.334 Compiler for C supports arguments -mavx512vl: YES
00:02:22.334 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:22.334 Compiler for C supports arguments -mavx2: YES
00:02:22.334 Compiler for C supports arguments -mavx: YES
00:02:22.334 Message: lib/net: Defining dependency "net"
00:02:22.334 Message: lib/meter: Defining dependency "meter"
00:02:22.334 Message: lib/ethdev: Defining dependency "ethdev"
00:02:22.334 Message: lib/pci: Defining dependency "pci"
00:02:22.334 Message: lib/cmdline: Defining dependency "cmdline"
00:02:22.334 Message: lib/hash: Defining dependency "hash"
00:02:22.334 Message: lib/timer: Defining dependency "timer"
00:02:22.334 Message: lib/compressdev: Defining dependency "compressdev"
00:02:22.334 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:22.334 Message: lib/dmadev: Defining dependency "dmadev"
00:02:22.334 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:22.334 Message: lib/power: Defining dependency "power"
00:02:22.334 Message: lib/reorder: Defining dependency "reorder"
00:02:22.334 Message: lib/security: Defining dependency "security"
00:02:22.334 Has header "linux/userfaultfd.h" : YES
00:02:22.334 Has header "linux/vduse.h" : YES
00:02:22.334 Message: lib/vhost: Defining dependency "vhost"
00:02:22.334 Compiler for C supports arguments -Wno-format-truncation: NO (cached)
00:02:22.334 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:22.334 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:22.334 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:02:22.334 Message: Disabling raw/* drivers: missing internal dependency "rawdev"
00:02:22.334 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:02:22.334 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:02:22.334 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:02:22.334 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:02:22.334 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:02:22.334 Program doxygen found: YES (/usr/bin/doxygen)
00:02:22.334 Configuring doxy-api-html.conf using configuration
00:02:22.334 Configuring doxy-api-man.conf using configuration
00:02:22.334 Program mandb found: YES (/usr/bin/mandb)
00:02:22.334 Program sphinx-build found: NO
00:02:22.334 Configuring rte_build_config.h using configuration
00:02:22.334 Message:
00:02:22.334 =================
00:02:22.334 Applications Enabled
00:02:22.334 =================
00:02:22.334 
00:02:22.334 apps:
00:02:22.334 
00:02:22.334 
00:02:22.334 Message:
00:02:22.334 =================
00:02:22.334 Libraries Enabled
00:02:22.334 =================
00:02:22.334 
00:02:22.334 libs:
00:02:22.334 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:02:22.334 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:02:22.334 cryptodev, dmadev, power, reorder, security, vhost,
00:02:22.334 
00:02:22.334 Message:
00:02:22.334 ===============
00:02:22.334 Drivers Enabled
00:02:22.334 ===============
00:02:22.334 
00:02:22.334 common:
00:02:22.334 
00:02:22.334 bus:
00:02:22.334 pci, vdev,
00:02:22.334 mempool:
00:02:22.334 ring,
00:02:22.334 dma:
00:02:22.334 
00:02:22.334 net:
00:02:22.334 
00:02:22.334 crypto:
00:02:22.334 
00:02:22.334 compress:
00:02:22.334 
00:02:22.334 vdpa:
00:02:22.334 
00:02:22.334 
00:02:22.334 Message:
00:02:22.334 =================
00:02:22.334 Content Skipped
00:02:22.334 =================
00:02:22.334 
00:02:22.334 apps:
00:02:22.334 dumpcap: explicitly disabled via build config
00:02:22.334 graph: explicitly disabled via build config
00:02:22.334 pdump: explicitly disabled via build config
00:02:22.334 proc-info: explicitly disabled via build config
00:02:22.334 test-acl: explicitly disabled via build config
00:02:22.334 test-bbdev: explicitly disabled via build config
00:02:22.334 test-cmdline: explicitly disabled via build config
00:02:22.334 test-compress-perf: explicitly disabled via build config
00:02:22.334 test-crypto-perf: explicitly disabled via build config
00:02:22.334 test-dma-perf: explicitly disabled via build config
00:02:22.334 test-eventdev: explicitly disabled via build config
00:02:22.334 test-fib: explicitly disabled via build config
00:02:22.334 test-flow-perf: explicitly disabled via build config
00:02:22.334 test-gpudev: explicitly disabled via build config
00:02:22.334 test-mldev: explicitly disabled via build config
00:02:22.334 test-pipeline: explicitly disabled via build config
00:02:22.334 test-pmd: explicitly disabled via build config
00:02:22.334 test-regex: explicitly disabled via build config
00:02:22.334 test-sad: explicitly disabled via build config
00:02:22.334 test-security-perf: explicitly disabled via build config
00:02:22.334 
00:02:22.334 libs:
00:02:22.334 metrics: explicitly disabled via build config
00:02:22.334 acl: explicitly disabled via build config
00:02:22.334 bbdev: explicitly disabled via build config
00:02:22.334 bitratestats: explicitly disabled via build config
00:02:22.334 bpf: explicitly disabled via build config
00:02:22.334 cfgfile: explicitly disabled via build config
00:02:22.334 distributor: explicitly disabled via build config
00:02:22.334 efd: explicitly disabled via build config
00:02:22.334 eventdev: explicitly disabled via build config
00:02:22.334 dispatcher: explicitly disabled via build config
00:02:22.334 gpudev: explicitly disabled via build config
00:02:22.334 gro: explicitly disabled via build config
00:02:22.334 gso: explicitly disabled via build config
00:02:22.334 ip_frag: explicitly disabled via build config
00:02:22.334 jobstats: explicitly disabled via build config
00:02:22.334 latencystats: explicitly disabled via build config
00:02:22.334 lpm: explicitly disabled via build config
00:02:22.334 member: explicitly disabled via build config
00:02:22.334 pcapng: explicitly disabled via build config
00:02:22.334 rawdev: explicitly disabled via build config
00:02:22.334 regexdev: explicitly disabled via build config
00:02:22.334 mldev: explicitly disabled via build config
00:02:22.334 rib: explicitly disabled via build config
00:02:22.334 sched: explicitly disabled via build config
00:02:22.334 stack: explicitly disabled via build config
00:02:22.334 ipsec: explicitly disabled via build config
00:02:22.334 pdcp: explicitly disabled via build config
00:02:22.334 fib: explicitly disabled via build config
00:02:22.334 port: explicitly disabled via build config
00:02:22.334 pdump: explicitly disabled via build config
00:02:22.334 table: explicitly disabled via build config
00:02:22.334 pipeline: explicitly disabled via build config
00:02:22.334 graph: explicitly disabled via build config
00:02:22.334 node: explicitly disabled via build config
00:02:22.334 
00:02:22.334 drivers:
00:02:22.334 common/cpt: not in enabled drivers build config
00:02:22.334 common/dpaax: not in enabled drivers build config
00:02:22.334 common/iavf: not in enabled drivers build config
00:02:22.334 common/idpf: not in enabled drivers build config
00:02:22.334 common/mvep: not in enabled drivers build config
00:02:22.334 common/octeontx: not in enabled drivers build config
00:02:22.334 bus/auxiliary: not in enabled drivers build config
00:02:22.334 bus/cdx: not in enabled drivers build config
00:02:22.334 bus/dpaa: not in enabled drivers build config
00:02:22.334 bus/fslmc: not in enabled drivers build config
00:02:22.334 bus/ifpga: not in enabled drivers build config
00:02:22.334 bus/platform: not in enabled drivers build config
00:02:22.334 bus/vmbus: not in enabled drivers build config
00:02:22.334 common/cnxk: not in enabled drivers build config
00:02:22.334 common/mlx5: not in enabled drivers build config
00:02:22.334 common/nfp: not in enabled drivers build config
00:02:22.334 common/qat: not in enabled drivers build config
00:02:22.334 common/sfc_efx: not in enabled drivers build config
00:02:22.334 mempool/bucket: not in enabled drivers build config
00:02:22.334 mempool/cnxk: not in enabled drivers build config
00:02:22.334 mempool/dpaa: not in enabled drivers build config
00:02:22.334 mempool/dpaa2: not in enabled drivers build config
00:02:22.334 mempool/octeontx: not in enabled drivers build config
00:02:22.334 mempool/stack: not in enabled drivers build config
00:02:22.334 dma/cnxk: not in enabled drivers build config
00:02:22.334 dma/dpaa: not in enabled drivers build config
00:02:22.334 dma/dpaa2: not in enabled drivers build config
00:02:22.334 dma/hisilicon: not in enabled drivers build config
00:02:22.334 dma/idxd: not in enabled drivers build config
00:02:22.334 dma/ioat: not in enabled drivers build config
00:02:22.334 dma/skeleton: not in enabled drivers build config
00:02:22.334 net/af_packet: not in enabled drivers build config
00:02:22.334 net/af_xdp: not in enabled drivers build config
00:02:22.334 net/ark: not in enabled drivers build config
00:02:22.334 net/atlantic: not in enabled drivers build config
00:02:22.334 net/avp: not in enabled drivers build config
00:02:22.334 net/axgbe: not in enabled drivers build config
00:02:22.335 net/bnx2x: not in enabled drivers build config
00:02:22.335 net/bnxt: not in enabled drivers build config
00:02:22.335 net/bonding: not in enabled drivers build config
00:02:22.335 net/cnxk: not in enabled drivers build config
00:02:22.335 net/cpfl: not in enabled drivers build config
00:02:22.335 net/cxgbe: not in enabled drivers build config
00:02:22.335 net/dpaa: not in enabled drivers build config
00:02:22.335 net/dpaa2: not in enabled drivers build config
00:02:22.335 net/e1000: not in enabled drivers build config
00:02:22.335 net/ena: not in enabled drivers build config
00:02:22.335 net/enetc: not in enabled drivers build config
00:02:22.335 net/enetfec: not in enabled drivers build config
00:02:22.335 net/enic: not in enabled drivers build config
00:02:22.335 net/failsafe: not in enabled drivers build config
00:02:22.335 net/fm10k: not in enabled drivers build config
00:02:22.335 net/gve: not in enabled drivers build config
00:02:22.335 net/hinic: not in enabled drivers build config
00:02:22.335 net/hns3: not in enabled drivers build config
00:02:22.335 net/i40e: not in enabled drivers build config
00:02:22.335 net/iavf: not in enabled drivers build config
00:02:22.335 net/ice: not in enabled drivers build config
00:02:22.335 net/idpf: not in enabled drivers build config
00:02:22.335 net/igc: not in enabled drivers build config
00:02:22.335 net/ionic: not in enabled drivers build config
00:02:22.335 net/ipn3ke: not in enabled drivers build config
00:02:22.335 net/ixgbe: not in enabled drivers build config
00:02:22.335 net/mana: not in enabled drivers build config
00:02:22.335 net/memif: not in enabled drivers build config
00:02:22.335 net/mlx4: not in enabled drivers build config
00:02:22.335 net/mlx5: not in enabled drivers build config
00:02:22.335 net/mvneta: not in enabled drivers build config
00:02:22.335 net/mvpp2: not in enabled drivers build config
00:02:22.335 net/netvsc: not in enabled drivers build config
00:02:22.335 net/nfb: not in enabled drivers build config
00:02:22.335 net/nfp: not in enabled drivers build config
00:02:22.335 net/ngbe: not in enabled drivers build config
00:02:22.335 net/null: not in enabled drivers build config
00:02:22.335 net/octeontx: not in enabled drivers build config
00:02:22.335 net/octeon_ep: not in enabled drivers build config
00:02:22.335 net/pcap: not in enabled drivers build config
00:02:22.335 net/pfe: not in enabled drivers build config
00:02:22.335 net/qede: not in enabled drivers build config
00:02:22.335 net/ring: not in enabled drivers build config
00:02:22.335 net/sfc: not in enabled drivers build config
00:02:22.335 net/softnic: not in enabled drivers build config
00:02:22.335 net/tap: not in enabled drivers build config
00:02:22.335 net/thunderx: not in enabled drivers build config
00:02:22.335 net/txgbe: not in enabled drivers build config
00:02:22.335 net/vdev_netvsc: not in enabled drivers build config
00:02:22.335 net/vhost: not in enabled drivers build config
00:02:22.335 net/virtio: not in enabled drivers build config
00:02:22.335 net/vmxnet3: not in enabled drivers build config
00:02:22.335 raw/*: missing internal dependency, "rawdev"
00:02:22.335 crypto/armv8: not in enabled drivers build config
00:02:22.335 crypto/bcmfs: not in enabled drivers build config
00:02:22.335 crypto/caam_jr: not in enabled drivers build config
00:02:22.335 crypto/ccp: not in enabled drivers build config
00:02:22.335 crypto/cnxk: not in enabled drivers build config
00:02:22.335 crypto/dpaa_sec: not in enabled drivers build config
00:02:22.335 crypto/dpaa2_sec: not in enabled drivers build config
00:02:22.335 crypto/ipsec_mb: not in enabled drivers build config
00:02:22.335 crypto/mlx5: not in enabled drivers build config
00:02:22.335 crypto/mvsam: not in enabled drivers build config
00:02:22.335 crypto/nitrox: not in enabled drivers build config
00:02:22.335 crypto/null: not in enabled drivers build config
00:02:22.335 crypto/octeontx: not in enabled drivers build config
00:02:22.335 crypto/openssl: not in enabled drivers build config
00:02:22.335 crypto/scheduler: not in enabled drivers build config
00:02:22.335 crypto/uadk: not in enabled drivers build config
00:02:22.335 crypto/virtio: not in enabled drivers build config
00:02:22.335 compress/isal: not in enabled drivers build config
00:02:22.335 compress/mlx5: not in enabled drivers build config
00:02:22.335 compress/octeontx: not in enabled drivers build config
00:02:22.335 compress/zlib: not in enabled drivers build config
00:02:22.335 regex/*: missing internal dependency, "regexdev"
00:02:22.335 ml/*: missing internal dependency, "mldev"
00:02:22.335 vdpa/ifc: not in enabled drivers build config
00:02:22.335 vdpa/mlx5: not in enabled drivers build config
00:02:22.335 vdpa/nfp: not in enabled drivers build config
00:02:22.335 vdpa/sfc: not in enabled drivers build config
00:02:22.335 event/*: missing internal dependency, "eventdev"
00:02:22.335 baseband/*: missing internal dependency, "bbdev"
00:02:22.335 gpu/*: missing internal dependency, "gpudev"
00:02:22.335 
00:02:22.335 
00:02:22.335 Build targets in project: 85
00:02:22.335 
00:02:22.335 DPDK 23.11.0
00:02:22.335 
00:02:22.335 User defined options
00:02:22.335 buildtype : debug
00:02:22.335 default_library : static
00:02:22.335 libdir : lib
00:02:22.335 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:02:22.335 c_args : -fPIC -Werror
00:02:22.335 c_link_args : 
00:02:22.335 cpu_instruction_set: native
00:02:22.335 disable_apps : test-fib,test-sad,test,test-regex,test-security-perf,test-bbdev,dumpcap,test-crypto-perf,test-flow-perf,test-gpudev,test-cmdline,test-dma-perf,test-eventdev,test-pipeline,test-acl,proc-info,test-compress-perf,graph,test-pmd,test-mldev,pdump
00:02:22.335 disable_libs : bbdev,latencystats,member,gpudev,mldev,pipeline,lpm,efd,regexdev,sched,node,dispatcher,table,bpf,port,gro,fib,cfgfile,ip_frag,gso,rawdev,ipsec,pdcp,rib,acl,metrics,graph,pcapng,jobstats,eventdev,stack,bitratestats,distributor,pdump
00:02:22.335 enable_docs : false
00:02:22.335 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring
00:02:22.335 enable_kmods : false
00:02:22.335 tests : false
00:02:22.335 
00:02:22.335 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:22.597 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp'
00:02:22.597 [1/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:02:22.597 [2/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:02:22.597 [3/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:02:22.597 [4/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:02:22.597 [5/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:02:22.597 [6/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:02:22.597 [7/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:02:22.597 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:02:22.597 [9/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:02:22.597 [10/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:02:22.597 [11/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:02:22.597 [12/265] Compiling C object lib/librte_log.a.p/log_log.c.o
00:02:22.597 [13/265] Linking static target lib/librte_kvargs.a
00:02:22.597 [14/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:02:22.597 [15/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:02:22.597 [16/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:02:22.597 [17/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:02:22.597 [18/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:02:22.597 [19/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:02:22.597 [20/265] Linking static target lib/librte_log.a
00:02:22.597 [21/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:02:22.597 [22/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:02:22.597 [23/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:02:22.597 [24/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:02:22.597 [25/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:02:22.597 [26/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:02:22.597 [27/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:02:22.597 [28/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:02:22.597 [29/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:02:22.597 [30/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:02:22.597 [31/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:02:22.597 [32/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:02:22.597 [33/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:02:22.597 [34/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:02:22.597 [35/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:02:22.597 [36/265] Linking static target lib/librte_pci.a
00:02:22.597 [37/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:02:22.597 [38/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:02:22.859 [39/265] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:02:22.859 [40/265] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:02:22.859 [41/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o
00:02:23.117 [42/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.117 [43/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.117 [44/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:02:23.117 [45/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:02:23.117 [46/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:02:23.117 [47/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:02:23.117 [48/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:02:23.117 [49/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:02:23.117 [50/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:02:23.117 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:02:23.117 [52/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:02:23.117 [53/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:02:23.117 [54/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:02:23.117 [55/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:02:23.117 [56/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:02:23.117 [57/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:02:23.117 [58/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:02:23.117 [59/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:02:23.117 [60/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:02:23.117 [61/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:02:23.117 [62/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:02:23.117 [63/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:02:23.117 [64/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:02:23.117 [65/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:02:23.117 [66/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:02:23.117 [67/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:02:23.117 [68/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:02:23.117 [69/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:02:23.117 [70/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:02:23.117 [71/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:02:23.117 [72/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:02:23.117 [73/265] Linking static target lib/librte_telemetry.a
00:02:23.117 [74/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:02:23.117 [75/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:02:23.117 [76/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:02:23.117 [77/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:02:23.117 [78/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:02:23.118 [79/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:02:23.118 [80/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:02:23.118 [81/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:02:23.118 [82/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:02:23.118 [83/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:02:23.118 [84/265] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:02:23.118 [85/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:02:23.118 [86/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:02:23.118 [87/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:02:23.118 [88/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:02:23.118 [89/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:02:23.118 [90/265] Linking static target lib/librte_meter.a
00:02:23.118 [91/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:02:23.118 [92/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:02:23.118 [93/265] Linking static target lib/net/libnet_crc_avx512_lib.a
00:02:23.118 [94/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:02:23.118 [95/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o
00:02:23.118 [96/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:02:23.118 [97/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:02:23.118 [98/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:02:23.118 [99/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:02:23.118 [100/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:02:23.118 [101/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:02:23.118 [102/265] Linking static target lib/librte_ring.a
00:02:23.118 [103/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:02:23.118 [104/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:02:23.118 [105/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o
00:02:23.118 [106/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o
00:02:23.118 [107/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:02:23.118 [108/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:02:23.118 [109/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:02:23.118 [110/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:02:23.118 [111/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:02:23.118 [112/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:02:23.118 [113/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:02:23.118 [114/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:02:23.118 [115/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:02:23.118 [116/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:02:23.118 [117/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:02:23.376 [118/265] Linking static target lib/librte_timer.a
00:02:23.376 [119/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:02:23.376 [120/265] Linking static target lib/librte_cmdline.a
00:02:23.376 [121/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:02:23.376 [122/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:02:23.376 [123/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o
00:02:23.376 [124/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o
00:02:23.376 [125/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:02:23.376 [126/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o
00:02:23.376 [127/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o
00:02:23.376 [128/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
00:02:23.376 [129/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o
00:02:23.376 [130/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o
00:02:23.376 [131/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:02:23.376 [132/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:02:23.376 [133/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o
00:02:23.376 [134/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:02:23.376 [135/265] Linking target lib/librte_log.so.24.0
00:02:23.376 [136/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o
00:02:23.376 [137/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:02:23.376 [138/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:02:23.376 [139/265] Linking static target lib/librte_eal.a
00:02:23.376 [140/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o
00:02:23.376 [141/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:02:23.376 [142/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o
00:02:23.376 [143/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o
00:02:23.376 [144/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o
00:02:23.376 [145/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:02:23.376 [146/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:02:23.376 [147/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o
00:02:23.376 [148/265] Linking static target lib/librte_compressdev.a
00:02:23.376 [149/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:02:23.376 [150/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o
00:02:23.376 [151/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o
00:02:23.376 [152/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o
00:02:23.376 [153/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o
00:02:23.376 [154/265] Linking static target lib/librte_mbuf.a
00:02:23.376 [155/265] Linking static target lib/librte_reorder.a
00:02:23.376 [156/265] Linking static target lib/librte_net.a
00:02:23.376 [157/265] Linking static target lib/librte_power.a
00:02:23.377 [158/265] Linking static target lib/librte_mempool.a
00:02:23.377 [159/265] Linking static target lib/librte_rcu.a
00:02:23.377 [160/265] Linking static target lib/librte_dmadev.a
00:02:23.377 [161/265] Compiling C object
lib/librte_vhost.a.p/vhost_socket.c.o 00:02:23.377 [162/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:23.377 [163/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:23.377 [164/265] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:23.377 [165/265] Linking static target lib/librte_security.a 00:02:23.377 [166/265] Linking static target lib/librte_hash.a 00:02:23.377 [167/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:02:23.377 [168/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:23.377 [169/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:23.377 [170/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:23.377 [171/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.377 [172/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:23.377 [173/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:23.377 [174/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:23.377 [175/265] Linking target lib/librte_kvargs.so.24.0 00:02:23.636 [176/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:23.636 [177/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:23.636 [178/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:23.636 [179/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:23.636 [180/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:23.636 [181/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:23.636 [182/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:23.636 [183/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.636 [184/265] Linking static target lib/librte_cryptodev.a 00:02:23.636 [185/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:23.636 [186/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:23.636 [187/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:23.636 [188/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:23.636 [189/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:23.636 [190/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:02:23.636 [191/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:23.636 [192/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:23.636 [193/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.636 [194/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.636 [195/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:23.636 [196/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.636 [197/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.636 [198/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:23.636 [199/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:23.636 [200/265] Compiling C object 
drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:23.896 [201/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.896 [202/265] Linking target lib/librte_telemetry.so.24.0 00:02:23.896 [203/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:23.896 [204/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:23.896 [205/265] Linking static target drivers/librte_bus_vdev.a 00:02:23.896 [206/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:23.896 [207/265] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:23.896 [208/265] Linking static target drivers/librte_bus_pci.a 00:02:23.896 [209/265] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:23.896 [210/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:23.896 [211/265] Linking static target drivers/librte_mempool_ring.a 00:02:23.896 [212/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:23.896 [213/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.896 [214/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:02:23.896 [215/265] Linking static target lib/librte_ethdev.a 00:02:24.155 [216/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.156 [217/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.156 [218/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.156 [219/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.414 [220/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.414 [221/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.414 [222/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.414 [223/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:24.414 [224/265] Linking static target lib/librte_vhost.a 00:02:24.672 [225/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.672 [226/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.609 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.547 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.113 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.401 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.401 [231/265] Linking target lib/librte_eal.so.24.0 00:02:36.401 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:36.401 [233/265] Linking target lib/librte_pci.so.24.0 00:02:36.401 [234/265] Linking target lib/librte_timer.so.24.0 00:02:36.401 [235/265] Linking target lib/librte_meter.so.24.0 00:02:36.401 [236/265] Linking target lib/librte_ring.so.24.0 00:02:36.401 [237/265] Linking target 
drivers/librte_bus_vdev.so.24.0 00:02:36.401 [238/265] Linking target lib/librte_dmadev.so.24.0 00:02:36.401 [239/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:36.401 [240/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:36.401 [241/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:36.401 [242/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:36.401 [243/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:36.401 [244/265] Linking target lib/librte_rcu.so.24.0 00:02:36.660 [245/265] Linking target drivers/librte_bus_pci.so.24.0 00:02:36.660 [246/265] Linking target lib/librte_mempool.so.24.0 00:02:36.660 [247/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:36.660 [248/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:36.660 [249/265] Linking target lib/librte_mbuf.so.24.0 00:02:36.660 [250/265] Linking target drivers/librte_mempool_ring.so.24.0 00:02:36.918 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:36.918 [252/265] Linking target lib/librte_net.so.24.0 00:02:36.918 [253/265] Linking target lib/librte_reorder.so.24.0 00:02:36.918 [254/265] Linking target lib/librte_compressdev.so.24.0 00:02:36.918 [255/265] Linking target lib/librte_cryptodev.so.24.0 00:02:37.178 [256/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:37.178 [257/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:37.178 [258/265] Linking target lib/librte_hash.so.24.0 00:02:37.178 [259/265] Linking target lib/librte_ethdev.so.24.0 00:02:37.178 [260/265] Linking target lib/librte_cmdline.so.24.0 00:02:37.178 [261/265] Linking target lib/librte_security.so.24.0 00:02:37.178 [262/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:37.178 [263/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:37.437 [264/265] Linking target lib/librte_power.so.24.0 00:02:37.437 [265/265] Linking target lib/librte_vhost.so.24.0 00:02:37.437 INFO: autodetecting backend as ninja 00:02:37.437 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 112 00:02:38.375 CC lib/ut_mock/mock.o 00:02:38.375 CC lib/log/log.o 00:02:38.375 CC lib/log/log_flags.o 00:02:38.375 CC lib/ut/ut.o 00:02:38.375 CC lib/log/log_deprecated.o 00:02:38.375 LIB libspdk_ut_mock.a 00:02:38.375 LIB libspdk_ut.a 00:02:38.375 LIB libspdk_log.a 00:02:38.634 CXX lib/trace_parser/trace.o 00:02:38.893 CC lib/dma/dma.o 00:02:38.893 CC lib/ioat/ioat.o 00:02:38.893 CC lib/util/base64.o 00:02:38.893 CC lib/util/bit_array.o 00:02:38.893 CC lib/util/crc32.o 00:02:38.893 CC lib/util/cpuset.o 00:02:38.893 CC lib/util/crc16.o 00:02:38.893 CC lib/util/crc64.o 00:02:38.893 CC lib/util/crc32c.o 00:02:38.893 CC lib/util/crc32_ieee.o 00:02:38.893 CC lib/util/dif.o 00:02:38.893 CC lib/util/hexlify.o 00:02:38.893 CC lib/util/fd.o 00:02:38.893 CC lib/util/file.o 00:02:38.893 CC lib/util/iov.o 00:02:38.893 CC lib/util/math.o 00:02:38.893 CC lib/util/pipe.o 00:02:38.893 CC lib/util/strerror_tls.o 00:02:38.893 CC lib/util/string.o 00:02:38.893 CC lib/util/uuid.o 00:02:38.893 CC lib/util/fd_group.o 00:02:38.893 CC lib/util/xor.o 00:02:38.893 CC 
lib/util/zipf.o 00:02:38.893 LIB libspdk_dma.a 00:02:38.893 CC lib/vfio_user/host/vfio_user_pci.o 00:02:38.893 CC lib/vfio_user/host/vfio_user.o 00:02:38.893 LIB libspdk_ioat.a 00:02:39.152 LIB libspdk_vfio_user.a 00:02:39.152 LIB libspdk_util.a 00:02:39.152 LIB libspdk_trace_parser.a 00:02:39.412 CC lib/conf/conf.o 00:02:39.412 CC lib/json/json_parse.o 00:02:39.412 CC lib/json/json_util.o 00:02:39.412 CC lib/json/json_write.o 00:02:39.412 CC lib/rdma/common.o 00:02:39.412 CC lib/rdma/rdma_verbs.o 00:02:39.412 CC lib/idxd/idxd.o 00:02:39.412 CC lib/idxd/idxd_kernel.o 00:02:39.412 CC lib/idxd/idxd_user.o 00:02:39.412 CC lib/vmd/vmd.o 00:02:39.412 CC lib/vmd/led.o 00:02:39.412 CC lib/env_dpdk/env.o 00:02:39.412 CC lib/env_dpdk/memory.o 00:02:39.412 CC lib/env_dpdk/pci.o 00:02:39.412 CC lib/env_dpdk/init.o 00:02:39.412 CC lib/env_dpdk/threads.o 00:02:39.412 CC lib/env_dpdk/pci_ioat.o 00:02:39.412 CC lib/env_dpdk/pci_virtio.o 00:02:39.412 CC lib/env_dpdk/pci_vmd.o 00:02:39.412 CC lib/env_dpdk/pci_idxd.o 00:02:39.412 CC lib/env_dpdk/pci_event.o 00:02:39.412 CC lib/env_dpdk/sigbus_handler.o 00:02:39.412 CC lib/env_dpdk/pci_dpdk.o 00:02:39.412 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:39.412 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:39.672 LIB libspdk_conf.a 00:02:39.672 LIB libspdk_rdma.a 00:02:39.672 LIB libspdk_json.a 00:02:39.672 LIB libspdk_idxd.a 00:02:39.931 LIB libspdk_vmd.a 00:02:39.931 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:39.932 CC lib/jsonrpc/jsonrpc_server.o 00:02:39.932 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:39.932 CC lib/jsonrpc/jsonrpc_client.o 00:02:39.932 LIB libspdk_jsonrpc.a 00:02:40.501 LIB libspdk_env_dpdk.a 00:02:40.501 CC lib/rpc/rpc.o 00:02:40.501 LIB libspdk_rpc.a 00:02:40.760 CC lib/sock/sock.o 00:02:40.760 CC lib/notify/notify.o 00:02:40.760 CC lib/notify/notify_rpc.o 00:02:40.760 CC lib/trace/trace.o 00:02:40.760 CC lib/sock/sock_rpc.o 00:02:40.760 CC lib/trace/trace_rpc.o 00:02:40.760 CC lib/trace/trace_flags.o 00:02:41.020 LIB libspdk_notify.a 00:02:41.020 LIB libspdk_trace.a 00:02:41.020 LIB libspdk_sock.a 00:02:41.279 CC lib/thread/iobuf.o 00:02:41.279 CC lib/thread/thread.o 00:02:41.538 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:41.538 CC lib/nvme/nvme_ctrlr.o 00:02:41.538 CC lib/nvme/nvme_fabric.o 00:02:41.538 CC lib/nvme/nvme_ns_cmd.o 00:02:41.538 CC lib/nvme/nvme_ns.o 00:02:41.538 CC lib/nvme/nvme_pcie_common.o 00:02:41.538 CC lib/nvme/nvme_pcie.o 00:02:41.538 CC lib/nvme/nvme_qpair.o 00:02:41.538 CC lib/nvme/nvme.o 00:02:41.538 CC lib/nvme/nvme_quirks.o 00:02:41.538 CC lib/nvme/nvme_transport.o 00:02:41.538 CC lib/nvme/nvme_discovery.o 00:02:41.538 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:41.538 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:41.538 CC lib/nvme/nvme_tcp.o 00:02:41.538 CC lib/nvme/nvme_opal.o 00:02:41.538 CC lib/nvme/nvme_io_msg.o 00:02:41.538 CC lib/nvme/nvme_poll_group.o 00:02:41.538 CC lib/nvme/nvme_zns.o 00:02:41.538 CC lib/nvme/nvme_cuse.o 00:02:41.538 CC lib/nvme/nvme_vfio_user.o 00:02:41.538 CC lib/nvme/nvme_rdma.o 00:02:42.106 LIB libspdk_thread.a 00:02:42.365 CC lib/accel/accel.o 00:02:42.365 CC lib/accel/accel_sw.o 00:02:42.365 CC lib/accel/accel_rpc.o 00:02:42.365 CC lib/vfu_tgt/tgt_endpoint.o 00:02:42.365 CC lib/vfu_tgt/tgt_rpc.o 00:02:42.366 CC lib/init/json_config.o 00:02:42.366 CC lib/init/subsystem.o 00:02:42.366 CC lib/init/subsystem_rpc.o 00:02:42.366 CC lib/init/rpc.o 00:02:42.366 CC lib/virtio/virtio_vhost_user.o 00:02:42.366 CC lib/virtio/virtio.o 00:02:42.366 CC lib/virtio/virtio_vfio_user.o 00:02:42.366 CC lib/virtio/virtio_pci.o 
00:02:42.366 CC lib/blob/blobstore.o 00:02:42.366 CC lib/blob/request.o 00:02:42.366 CC lib/blob/zeroes.o 00:02:42.366 CC lib/blob/blob_bs_dev.o 00:02:42.625 LIB libspdk_init.a 00:02:42.625 LIB libspdk_vfu_tgt.a 00:02:42.625 LIB libspdk_virtio.a 00:02:42.625 LIB libspdk_nvme.a 00:02:42.884 CC lib/event/app.o 00:02:42.884 CC lib/event/reactor.o 00:02:42.884 CC lib/event/log_rpc.o 00:02:42.884 CC lib/event/app_rpc.o 00:02:42.884 CC lib/event/scheduler_static.o 00:02:43.144 LIB libspdk_accel.a 00:02:43.144 LIB libspdk_event.a 00:02:43.403 CC lib/bdev/bdev.o 00:02:43.403 CC lib/bdev/bdev_rpc.o 00:02:43.403 CC lib/bdev/bdev_zone.o 00:02:43.403 CC lib/bdev/part.o 00:02:43.403 CC lib/bdev/scsi_nvme.o 00:02:43.972 LIB libspdk_blob.a 00:02:44.236 CC lib/blobfs/blobfs.o 00:02:44.236 CC lib/blobfs/tree.o 00:02:44.236 CC lib/lvol/lvol.o 00:02:44.806 LIB libspdk_lvol.a 00:02:44.806 LIB libspdk_blobfs.a 00:02:45.065 LIB libspdk_bdev.a 00:02:45.323 CC lib/scsi/lun.o 00:02:45.323 CC lib/scsi/port.o 00:02:45.323 CC lib/scsi/dev.o 00:02:45.323 CC lib/scsi/scsi.o 00:02:45.323 CC lib/scsi/scsi_bdev.o 00:02:45.323 CC lib/scsi/scsi_pr.o 00:02:45.323 CC lib/scsi/task.o 00:02:45.323 CC lib/scsi/scsi_rpc.o 00:02:45.323 CC lib/ftl/ftl_core.o 00:02:45.323 CC lib/ftl/ftl_init.o 00:02:45.323 CC lib/ftl/ftl_io.o 00:02:45.323 CC lib/ftl/ftl_layout.o 00:02:45.323 CC lib/ftl/ftl_debug.o 00:02:45.323 CC lib/ftl/ftl_sb.o 00:02:45.323 CC lib/ftl/ftl_l2p.o 00:02:45.323 CC lib/ftl/ftl_l2p_flat.o 00:02:45.323 CC lib/ftl/ftl_nv_cache.o 00:02:45.323 CC lib/ftl/ftl_band.o 00:02:45.323 CC lib/ftl/ftl_writer.o 00:02:45.324 CC lib/ftl/ftl_band_ops.o 00:02:45.324 CC lib/ftl/ftl_rq.o 00:02:45.324 CC lib/ftl/ftl_reloc.o 00:02:45.324 CC lib/ftl/ftl_l2p_cache.o 00:02:45.324 CC lib/ftl/ftl_p2l.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:45.324 CC lib/nvmf/ctrlr.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:45.324 CC lib/nvmf/ctrlr_discovery.o 00:02:45.324 CC lib/nvmf/ctrlr_bdev.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:45.324 CC lib/nvmf/subsystem.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:45.324 CC lib/nvmf/nvmf.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:45.324 CC lib/nvmf/nvmf_rpc.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:45.324 CC lib/nvmf/transport.o 00:02:45.324 CC lib/ublk/ublk.o 00:02:45.324 CC lib/nvmf/vfio_user.o 00:02:45.324 CC lib/nvmf/tcp.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:45.324 CC lib/ublk/ublk_rpc.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:45.324 CC lib/ftl/utils/ftl_conf.o 00:02:45.324 CC lib/nvmf/rdma.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:45.324 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:45.324 CC lib/ftl/utils/ftl_bitmap.o 00:02:45.324 CC lib/ftl/utils/ftl_md.o 00:02:45.324 CC lib/ftl/utils/ftl_mempool.o 00:02:45.324 CC lib/nbd/nbd.o 00:02:45.324 CC lib/nbd/nbd_rpc.o 00:02:45.324 CC lib/ftl/utils/ftl_property.o 00:02:45.324 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:45.324 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:45.324 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:45.324 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:45.324 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:45.324 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:45.324 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:45.324 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:45.324 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:45.324 CC 
lib/ftl/base/ftl_base_dev.o 00:02:45.324 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:45.324 CC lib/ftl/base/ftl_base_bdev.o 00:02:45.324 CC lib/ftl/ftl_trace.o 00:02:45.582 LIB libspdk_scsi.a 00:02:45.842 LIB libspdk_nbd.a 00:02:45.842 LIB libspdk_ublk.a 00:02:45.842 LIB libspdk_ftl.a 00:02:46.100 CC lib/vhost/vhost.o 00:02:46.101 CC lib/vhost/vhost_rpc.o 00:02:46.101 CC lib/vhost/vhost_scsi.o 00:02:46.101 CC lib/vhost/vhost_blk.o 00:02:46.101 CC lib/vhost/rte_vhost_user.o 00:02:46.101 CC lib/iscsi/conn.o 00:02:46.101 CC lib/iscsi/init_grp.o 00:02:46.101 CC lib/iscsi/iscsi.o 00:02:46.101 CC lib/iscsi/param.o 00:02:46.101 CC lib/iscsi/md5.o 00:02:46.101 CC lib/iscsi/portal_grp.o 00:02:46.101 CC lib/iscsi/tgt_node.o 00:02:46.101 CC lib/iscsi/iscsi_subsystem.o 00:02:46.101 CC lib/iscsi/iscsi_rpc.o 00:02:46.101 CC lib/iscsi/task.o 00:02:46.669 LIB libspdk_nvmf.a 00:02:46.669 LIB libspdk_vhost.a 00:02:46.669 LIB libspdk_iscsi.a 00:02:47.237 CC module/env_dpdk/env_dpdk_rpc.o 00:02:47.237 CC module/vfu_device/vfu_virtio.o 00:02:47.237 CC module/vfu_device/vfu_virtio_blk.o 00:02:47.237 CC module/vfu_device/vfu_virtio_scsi.o 00:02:47.237 CC module/vfu_device/vfu_virtio_rpc.o 00:02:47.237 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:47.237 CC module/accel/iaa/accel_iaa_rpc.o 00:02:47.237 CC module/accel/iaa/accel_iaa.o 00:02:47.237 LIB libspdk_env_dpdk_rpc.a 00:02:47.496 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:47.496 CC module/accel/dsa/accel_dsa.o 00:02:47.496 CC module/accel/dsa/accel_dsa_rpc.o 00:02:47.496 CC module/scheduler/gscheduler/gscheduler.o 00:02:47.496 CC module/blob/bdev/blob_bdev.o 00:02:47.496 CC module/accel/error/accel_error_rpc.o 00:02:47.496 CC module/accel/error/accel_error.o 00:02:47.496 CC module/sock/posix/posix.o 00:02:47.496 CC module/accel/ioat/accel_ioat.o 00:02:47.496 CC module/accel/ioat/accel_ioat_rpc.o 00:02:47.496 LIB libspdk_scheduler_dynamic.a 00:02:47.496 LIB libspdk_scheduler_dpdk_governor.a 00:02:47.496 LIB libspdk_accel_iaa.a 00:02:47.496 LIB libspdk_scheduler_gscheduler.a 00:02:47.496 LIB libspdk_accel_error.a 00:02:47.496 LIB libspdk_accel_ioat.a 00:02:47.496 LIB libspdk_blob_bdev.a 00:02:47.496 LIB libspdk_accel_dsa.a 00:02:47.754 LIB libspdk_vfu_device.a 00:02:47.754 LIB libspdk_sock_posix.a 00:02:48.012 CC module/bdev/malloc/bdev_malloc.o 00:02:48.012 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:48.012 CC module/bdev/delay/vbdev_delay.o 00:02:48.012 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:48.012 CC module/bdev/lvol/vbdev_lvol.o 00:02:48.012 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:48.012 CC module/bdev/nvme/bdev_nvme.o 00:02:48.012 CC module/bdev/null/bdev_null.o 00:02:48.012 CC module/bdev/nvme/bdev_mdns_client.o 00:02:48.012 CC module/bdev/nvme/vbdev_opal.o 00:02:48.012 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:48.012 CC module/bdev/nvme/nvme_rpc.o 00:02:48.012 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:48.012 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:48.012 CC module/bdev/null/bdev_null_rpc.o 00:02:48.012 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:48.012 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:48.012 CC module/bdev/split/vbdev_split.o 00:02:48.012 CC module/blobfs/bdev/blobfs_bdev.o 00:02:48.012 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:48.012 CC module/bdev/split/vbdev_split_rpc.o 00:02:48.012 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:48.012 CC module/bdev/aio/bdev_aio.o 00:02:48.012 CC module/bdev/ftl/bdev_ftl.o 00:02:48.012 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:48.012 CC 
module/bdev/aio/bdev_aio_rpc.o 00:02:48.012 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:48.012 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:48.012 CC module/bdev/gpt/gpt.o 00:02:48.012 CC module/bdev/gpt/vbdev_gpt.o 00:02:48.012 CC module/bdev/passthru/vbdev_passthru.o 00:02:48.012 CC module/bdev/error/vbdev_error_rpc.o 00:02:48.012 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:48.012 CC module/bdev/error/vbdev_error.o 00:02:48.012 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:48.012 CC module/bdev/iscsi/bdev_iscsi.o 00:02:48.012 CC module/bdev/raid/bdev_raid.o 00:02:48.012 CC module/bdev/raid/bdev_raid_sb.o 00:02:48.012 CC module/bdev/raid/bdev_raid_rpc.o 00:02:48.012 CC module/bdev/raid/raid0.o 00:02:48.012 CC module/bdev/raid/raid1.o 00:02:48.012 CC module/bdev/raid/concat.o 00:02:48.012 LIB libspdk_blobfs_bdev.a 00:02:48.270 LIB libspdk_bdev_split.a 00:02:48.270 LIB libspdk_bdev_null.a 00:02:48.270 LIB libspdk_bdev_error.a 00:02:48.270 LIB libspdk_bdev_gpt.a 00:02:48.270 LIB libspdk_bdev_ftl.a 00:02:48.270 LIB libspdk_bdev_aio.a 00:02:48.270 LIB libspdk_bdev_passthru.a 00:02:48.270 LIB libspdk_bdev_zone_block.a 00:02:48.270 LIB libspdk_bdev_malloc.a 00:02:48.270 LIB libspdk_bdev_delay.a 00:02:48.270 LIB libspdk_bdev_iscsi.a 00:02:48.270 LIB libspdk_bdev_lvol.a 00:02:48.270 LIB libspdk_bdev_virtio.a 00:02:48.527 LIB libspdk_bdev_raid.a 00:02:49.099 LIB libspdk_bdev_nvme.a 00:02:49.667 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:49.926 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:49.926 CC module/event/subsystems/vmd/vmd.o 00:02:49.926 CC module/event/subsystems/iobuf/iobuf.o 00:02:49.926 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:49.926 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:49.926 CC module/event/subsystems/sock/sock.o 00:02:49.926 CC module/event/subsystems/scheduler/scheduler.o 00:02:49.926 LIB libspdk_event_vfu_tgt.a 00:02:49.926 LIB libspdk_event_vmd.a 00:02:49.926 LIB libspdk_event_sock.a 00:02:49.926 LIB libspdk_event_vhost_blk.a 00:02:49.926 LIB libspdk_event_scheduler.a 00:02:49.926 LIB libspdk_event_iobuf.a 00:02:50.185 CC module/event/subsystems/accel/accel.o 00:02:50.444 LIB libspdk_event_accel.a 00:02:50.702 CC module/event/subsystems/bdev/bdev.o 00:02:50.702 LIB libspdk_event_bdev.a 00:02:51.269 CC module/event/subsystems/scsi/scsi.o 00:02:51.269 CC module/event/subsystems/ublk/ublk.o 00:02:51.269 CC module/event/subsystems/nbd/nbd.o 00:02:51.269 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:51.269 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:51.269 LIB libspdk_event_ublk.a 00:02:51.269 LIB libspdk_event_nbd.a 00:02:51.269 LIB libspdk_event_scsi.a 00:02:51.269 LIB libspdk_event_nvmf.a 00:02:51.527 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:51.527 CC module/event/subsystems/iscsi/iscsi.o 00:02:51.785 LIB libspdk_event_vhost_scsi.a 00:02:51.785 LIB libspdk_event_iscsi.a 00:02:52.048 CC app/spdk_nvme_discover/discovery_aer.o 00:02:52.048 CC app/spdk_nvme_identify/identify.o 00:02:52.048 CC app/spdk_lspci/spdk_lspci.o 00:02:52.048 CXX app/trace/trace.o 00:02:52.048 CC app/spdk_top/spdk_top.o 00:02:52.048 CC app/trace_record/trace_record.o 00:02:52.048 CC app/spdk_nvme_perf/perf.o 00:02:52.048 CC test/rpc_client/rpc_client_test.o 00:02:52.048 CC app/spdk_dd/spdk_dd.o 00:02:52.048 TEST_HEADER include/spdk/accel.h 00:02:52.048 CC app/nvmf_tgt/nvmf_main.o 00:02:52.048 TEST_HEADER include/spdk/accel_module.h 00:02:52.048 CC app/iscsi_tgt/iscsi_tgt.o 00:02:52.048 TEST_HEADER include/spdk/barrier.h 00:02:52.048 TEST_HEADER 
include/spdk/base64.h 00:02:52.048 TEST_HEADER include/spdk/bdev.h 00:02:52.048 TEST_HEADER include/spdk/assert.h 00:02:52.048 TEST_HEADER include/spdk/bdev_zone.h 00:02:52.048 TEST_HEADER include/spdk/bdev_module.h 00:02:52.048 TEST_HEADER include/spdk/bit_pool.h 00:02:52.048 TEST_HEADER include/spdk/blob_bdev.h 00:02:52.048 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:52.048 TEST_HEADER include/spdk/bit_array.h 00:02:52.048 TEST_HEADER include/spdk/blobfs.h 00:02:52.048 TEST_HEADER include/spdk/conf.h 00:02:52.048 TEST_HEADER include/spdk/blob.h 00:02:52.048 TEST_HEADER include/spdk/config.h 00:02:52.048 TEST_HEADER include/spdk/cpuset.h 00:02:52.048 TEST_HEADER include/spdk/crc16.h 00:02:52.048 TEST_HEADER include/spdk/crc32.h 00:02:52.048 TEST_HEADER include/spdk/dif.h 00:02:52.048 TEST_HEADER include/spdk/crc64.h 00:02:52.048 TEST_HEADER include/spdk/endian.h 00:02:52.048 TEST_HEADER include/spdk/env_dpdk.h 00:02:52.048 TEST_HEADER include/spdk/dma.h 00:02:52.048 TEST_HEADER include/spdk/event.h 00:02:52.048 TEST_HEADER include/spdk/env.h 00:02:52.048 CC app/vhost/vhost.o 00:02:52.048 CC app/spdk_tgt/spdk_tgt.o 00:02:52.048 TEST_HEADER include/spdk/fd_group.h 00:02:52.048 TEST_HEADER include/spdk/fd.h 00:02:52.048 TEST_HEADER include/spdk/file.h 00:02:52.048 TEST_HEADER include/spdk/ftl.h 00:02:52.048 TEST_HEADER include/spdk/gpt_spec.h 00:02:52.048 TEST_HEADER include/spdk/hexlify.h 00:02:52.048 TEST_HEADER include/spdk/histogram_data.h 00:02:52.048 TEST_HEADER include/spdk/idxd.h 00:02:52.048 TEST_HEADER include/spdk/idxd_spec.h 00:02:52.048 TEST_HEADER include/spdk/init.h 00:02:52.048 TEST_HEADER include/spdk/ioat.h 00:02:52.048 TEST_HEADER include/spdk/ioat_spec.h 00:02:52.048 TEST_HEADER include/spdk/json.h 00:02:52.048 TEST_HEADER include/spdk/iscsi_spec.h 00:02:52.048 TEST_HEADER include/spdk/jsonrpc.h 00:02:52.048 TEST_HEADER include/spdk/likely.h 00:02:52.048 TEST_HEADER include/spdk/log.h 00:02:52.048 TEST_HEADER include/spdk/lvol.h 00:02:52.048 TEST_HEADER include/spdk/memory.h 00:02:52.048 TEST_HEADER include/spdk/nbd.h 00:02:52.048 TEST_HEADER include/spdk/mmio.h 00:02:52.048 TEST_HEADER include/spdk/notify.h 00:02:52.048 TEST_HEADER include/spdk/nvme_intel.h 00:02:52.048 TEST_HEADER include/spdk/nvme.h 00:02:52.048 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:52.048 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:52.048 TEST_HEADER include/spdk/nvme_spec.h 00:02:52.048 TEST_HEADER include/spdk/nvme_zns.h 00:02:52.048 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:52.048 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:52.048 TEST_HEADER include/spdk/nvmf.h 00:02:52.048 TEST_HEADER include/spdk/nvmf_spec.h 00:02:52.048 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:52.048 TEST_HEADER include/spdk/nvmf_transport.h 00:02:52.048 TEST_HEADER include/spdk/opal.h 00:02:52.048 TEST_HEADER include/spdk/opal_spec.h 00:02:52.048 TEST_HEADER include/spdk/pci_ids.h 00:02:52.048 TEST_HEADER include/spdk/pipe.h 00:02:52.048 TEST_HEADER include/spdk/queue.h 00:02:52.048 TEST_HEADER include/spdk/reduce.h 00:02:52.048 TEST_HEADER include/spdk/rpc.h 00:02:52.048 TEST_HEADER include/spdk/scheduler.h 00:02:52.048 TEST_HEADER include/spdk/scsi.h 00:02:52.048 TEST_HEADER include/spdk/scsi_spec.h 00:02:52.048 TEST_HEADER include/spdk/sock.h 00:02:52.048 TEST_HEADER include/spdk/stdinc.h 00:02:52.048 TEST_HEADER include/spdk/string.h 00:02:52.048 TEST_HEADER include/spdk/trace.h 00:02:52.048 TEST_HEADER include/spdk/thread.h 00:02:52.048 TEST_HEADER include/spdk/trace_parser.h 00:02:52.048 TEST_HEADER 
include/spdk/tree.h 00:02:52.048 TEST_HEADER include/spdk/util.h 00:02:52.048 TEST_HEADER include/spdk/ublk.h 00:02:52.048 TEST_HEADER include/spdk/uuid.h 00:02:52.048 TEST_HEADER include/spdk/version.h 00:02:52.048 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:52.048 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:52.048 TEST_HEADER include/spdk/vhost.h 00:02:52.048 TEST_HEADER include/spdk/vmd.h 00:02:52.048 TEST_HEADER include/spdk/xor.h 00:02:52.048 TEST_HEADER include/spdk/zipf.h 00:02:52.048 CXX test/cpp_headers/accel.o 00:02:52.048 CXX test/cpp_headers/accel_module.o 00:02:52.048 CXX test/cpp_headers/assert.o 00:02:52.048 CXX test/cpp_headers/barrier.o 00:02:52.048 CXX test/cpp_headers/bdev.o 00:02:52.048 CXX test/cpp_headers/base64.o 00:02:52.048 CXX test/cpp_headers/bdev_module.o 00:02:52.048 CXX test/cpp_headers/bdev_zone.o 00:02:52.048 CXX test/cpp_headers/bit_array.o 00:02:52.048 CXX test/cpp_headers/bit_pool.o 00:02:52.048 CXX test/cpp_headers/blobfs_bdev.o 00:02:52.048 CXX test/cpp_headers/blob_bdev.o 00:02:52.048 CXX test/cpp_headers/blob.o 00:02:52.048 CXX test/cpp_headers/blobfs.o 00:02:52.048 CXX test/cpp_headers/conf.o 00:02:52.049 CXX test/cpp_headers/cpuset.o 00:02:52.049 CC app/fio/nvme/fio_plugin.o 00:02:52.049 CXX test/cpp_headers/config.o 00:02:52.049 LINK spdk_lspci 00:02:52.049 CXX test/cpp_headers/crc16.o 00:02:52.049 CXX test/cpp_headers/crc32.o 00:02:52.049 CXX test/cpp_headers/crc64.o 00:02:52.049 CXX test/cpp_headers/dif.o 00:02:52.049 CXX test/cpp_headers/dma.o 00:02:52.049 CXX test/cpp_headers/endian.o 00:02:52.049 CXX test/cpp_headers/env_dpdk.o 00:02:52.049 CXX test/cpp_headers/env.o 00:02:52.049 CXX test/cpp_headers/fd_group.o 00:02:52.049 CXX test/cpp_headers/event.o 00:02:52.049 CXX test/cpp_headers/fd.o 00:02:52.049 CXX test/cpp_headers/file.o 00:02:52.049 CXX test/cpp_headers/ftl.o 00:02:52.049 CXX test/cpp_headers/gpt_spec.o 00:02:52.049 CXX test/cpp_headers/hexlify.o 00:02:52.049 CC examples/vmd/led/led.o 00:02:52.049 CXX test/cpp_headers/histogram_data.o 00:02:52.049 CXX test/cpp_headers/idxd.o 00:02:52.049 CXX test/cpp_headers/idxd_spec.o 00:02:52.049 CXX test/cpp_headers/init.o 00:02:52.049 CC test/app/jsoncat/jsoncat.o 00:02:52.049 CC test/app/histogram_perf/histogram_perf.o 00:02:52.049 CC examples/vmd/lsvmd/lsvmd.o 00:02:52.049 CC test/app/stub/stub.o 00:02:52.049 CC examples/nvme/arbitration/arbitration.o 00:02:52.049 CC examples/nvme/hello_world/hello_world.o 00:02:52.049 CC examples/nvme/hotplug/hotplug.o 00:02:52.049 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:52.049 CC test/event/reactor/reactor.o 00:02:52.049 CC examples/nvme/reconnect/reconnect.o 00:02:52.049 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:52.049 CC test/nvme/overhead/overhead.o 00:02:52.310 CC test/event/reactor_perf/reactor_perf.o 00:02:52.310 CC test/thread/lock/spdk_lock.o 00:02:52.310 CC test/thread/poller_perf/poller_perf.o 00:02:52.310 CC test/nvme/reserve/reserve.o 00:02:52.310 CC examples/sock/hello_world/hello_sock.o 00:02:52.310 CC examples/nvme/abort/abort.o 00:02:52.310 CC test/nvme/reset/reset.o 00:02:52.310 CC test/nvme/aer/aer.o 00:02:52.310 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:52.310 CC examples/ioat/perf/perf.o 00:02:52.310 CC test/event/event_perf/event_perf.o 00:02:52.310 CC test/env/pci/pci_ut.o 00:02:52.310 CC test/nvme/err_injection/err_injection.o 00:02:52.310 CC test/nvme/sgl/sgl.o 00:02:52.310 CC test/env/memory/memory_ut.o 00:02:52.310 CC app/fio/bdev/fio_plugin.o 00:02:52.310 CC test/nvme/e2edp/nvme_dp.o 00:02:52.310 CC 
test/nvme/simple_copy/simple_copy.o 00:02:52.310 CC test/nvme/startup/startup.o 00:02:52.310 CC test/nvme/boot_partition/boot_partition.o 00:02:52.310 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:52.310 CC test/nvme/compliance/nvme_compliance.o 00:02:52.310 CC test/env/vtophys/vtophys.o 00:02:52.310 CC examples/accel/perf/accel_perf.o 00:02:52.310 CC test/nvme/fused_ordering/fused_ordering.o 00:02:52.310 CC test/nvme/connect_stress/connect_stress.o 00:02:52.310 CC examples/util/zipf/zipf.o 00:02:52.310 CC test/nvme/fdp/fdp.o 00:02:52.310 CC test/event/app_repeat/app_repeat.o 00:02:52.310 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:52.310 CC examples/idxd/perf/perf.o 00:02:52.310 CC test/nvme/cuse/cuse.o 00:02:52.310 CXX test/cpp_headers/ioat.o 00:02:52.310 CC examples/ioat/verify/verify.o 00:02:52.310 CC examples/bdev/hello_world/hello_bdev.o 00:02:52.310 LINK spdk_nvme_discover 00:02:52.310 CC examples/blob/hello_world/hello_blob.o 00:02:52.310 CC test/app/bdev_svc/bdev_svc.o 00:02:52.310 CC examples/blob/cli/blobcli.o 00:02:52.310 CC test/event/scheduler/scheduler.o 00:02:52.310 CC test/accel/dif/dif.o 00:02:52.310 CC test/dma/test_dma/test_dma.o 00:02:52.310 CC examples/bdev/bdevperf/bdevperf.o 00:02:52.310 CC test/bdev/bdevio/bdevio.o 00:02:52.310 LINK rpc_client_test 00:02:52.310 CC test/blobfs/mkfs/mkfs.o 00:02:52.310 CC examples/nvmf/nvmf/nvmf.o 00:02:52.310 CC examples/thread/thread/thread_ex.o 00:02:52.310 LINK nvmf_tgt 00:02:52.310 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:52.310 CC test/env/mem_callbacks/mem_callbacks.o 00:02:52.310 CC test/lvol/esnap/esnap.o 00:02:52.310 LINK spdk_trace_record 00:02:52.310 LINK iscsi_tgt 00:02:52.310 LINK led 00:02:52.310 LINK lsvmd 00:02:52.310 LINK jsoncat 00:02:52.310 LINK vhost 00:02:52.310 LINK interrupt_tgt 00:02:52.310 LINK reactor 00:02:52.310 CXX test/cpp_headers/ioat_spec.o 00:02:52.310 LINK reactor_perf 00:02:52.310 LINK spdk_tgt 00:02:52.310 CXX test/cpp_headers/iscsi_spec.o 00:02:52.310 CXX test/cpp_headers/json.o 00:02:52.310 LINK histogram_perf 00:02:52.310 CXX test/cpp_headers/jsonrpc.o 00:02:52.310 CXX test/cpp_headers/likely.o 00:02:52.310 LINK poller_perf 00:02:52.310 CXX test/cpp_headers/log.o 00:02:52.310 CXX test/cpp_headers/lvol.o 00:02:52.310 CXX test/cpp_headers/memory.o 00:02:52.310 CXX test/cpp_headers/mmio.o 00:02:52.310 LINK event_perf 00:02:52.310 CXX test/cpp_headers/nbd.o 00:02:52.310 CXX test/cpp_headers/notify.o 00:02:52.310 CXX test/cpp_headers/nvme.o 00:02:52.310 CXX test/cpp_headers/nvme_intel.o 00:02:52.310 CXX test/cpp_headers/nvme_ocssd.o 00:02:52.310 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:52.310 CXX test/cpp_headers/nvme_spec.o 00:02:52.310 CXX test/cpp_headers/nvme_zns.o 00:02:52.310 CXX test/cpp_headers/nvmf_cmd.o 00:02:52.310 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:52.310 CXX test/cpp_headers/nvmf.o 00:02:52.310 CXX test/cpp_headers/nvmf_spec.o 00:02:52.310 LINK vtophys 00:02:52.310 CXX test/cpp_headers/nvmf_transport.o 00:02:52.311 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:02:52.311 struct spdk_nvme_fdp_ruhs ruhs; 00:02:52.311 ^ 00:02:52.311 CXX test/cpp_headers/opal.o 00:02:52.311 LINK app_repeat 00:02:52.311 LINK stub 00:02:52.311 LINK env_dpdk_post_init 00:02:52.311 CXX test/cpp_headers/opal_spec.o 00:02:52.311 CXX test/cpp_headers/pci_ids.o 00:02:52.311 CXX test/cpp_headers/pipe.o 00:02:52.311 CXX 
test/cpp_headers/queue.o 00:02:52.311 LINK startup 00:02:52.311 CXX test/cpp_headers/reduce.o 00:02:52.311 LINK zipf 00:02:52.311 CXX test/cpp_headers/rpc.o 00:02:52.573 CXX test/cpp_headers/scheduler.o 00:02:52.573 LINK cmb_copy 00:02:52.573 LINK boot_partition 00:02:52.573 LINK err_injection 00:02:52.573 LINK pmr_persistence 00:02:52.573 CXX test/cpp_headers/scsi.o 00:02:52.573 CXX test/cpp_headers/scsi_spec.o 00:02:52.573 LINK fused_ordering 00:02:52.573 LINK connect_stress 00:02:52.573 LINK doorbell_aers 00:02:52.573 LINK reserve 00:02:52.573 LINK bdev_svc 00:02:52.573 LINK spdk_trace 00:02:52.573 CXX test/cpp_headers/sock.o 00:02:52.573 LINK ioat_perf 00:02:52.573 LINK verify 00:02:52.573 CXX test/cpp_headers/stdinc.o 00:02:52.573 LINK hello_world 00:02:52.573 LINK simple_copy 00:02:52.573 LINK hello_sock 00:02:52.573 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:52.573 LINK hotplug 00:02:52.573 LINK mkfs 00:02:52.573 LINK reset 00:02:52.574 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:52.574 LINK scheduler 00:02:52.574 LINK fdp 00:02:52.574 LINK hello_bdev 00:02:52.574 LINK hello_blob 00:02:52.574 LINK overhead 00:02:52.574 LINK sgl 00:02:52.574 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:52.574 LINK aer 00:02:52.574 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:52.574 LINK nvme_dp 00:02:52.574 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:52.574 LINK thread 00:02:52.574 CXX test/cpp_headers/string.o 00:02:52.574 CXX test/cpp_headers/thread.o 00:02:52.574 LINK reconnect 00:02:52.574 CXX test/cpp_headers/trace.o 00:02:52.574 CXX test/cpp_headers/trace_parser.o 00:02:52.574 CXX test/cpp_headers/tree.o 00:02:52.574 CXX test/cpp_headers/ublk.o 00:02:52.574 LINK idxd_perf 00:02:52.574 CXX test/cpp_headers/util.o 00:02:52.574 CXX test/cpp_headers/uuid.o 00:02:52.574 LINK spdk_dd 00:02:52.574 CXX test/cpp_headers/version.o 00:02:52.574 CXX test/cpp_headers/vfio_user_pci.o 00:02:52.574 CXX test/cpp_headers/vfio_user_spec.o 00:02:52.574 CXX test/cpp_headers/vhost.o 00:02:52.574 CXX test/cpp_headers/vmd.o 00:02:52.574 CXX test/cpp_headers/xor.o 00:02:52.574 CXX test/cpp_headers/zipf.o 00:02:52.834 LINK nvmf 00:02:52.834 LINK test_dma 00:02:52.834 LINK arbitration 00:02:52.834 LINK abort 00:02:52.834 LINK accel_perf 00:02:52.834 LINK pci_ut 00:02:52.834 LINK nvme_compliance 00:02:52.834 LINK nvme_manage 00:02:52.834 LINK dif 00:02:52.834 LINK bdevio 00:02:52.834 LINK nvme_fuzz 00:02:52.834 LINK blobcli 00:02:52.834 1 warning generated. 
00:02:52.834 LINK mem_callbacks 00:02:53.093 LINK spdk_bdev 00:02:53.093 LINK spdk_nvme 00:02:53.093 LINK llvm_vfio_fuzz 00:02:53.093 LINK spdk_top 00:02:53.093 LINK spdk_nvme_perf 00:02:53.093 LINK spdk_nvme_identify 00:02:53.093 LINK bdevperf 00:02:53.093 LINK vhost_fuzz 00:02:53.093 LINK memory_ut 00:02:53.352 LINK cuse 00:02:53.352 LINK llvm_nvme_fuzz 00:02:53.917 LINK spdk_lock 00:02:53.917 LINK iscsi_fuzz 00:02:55.824 LINK esnap 00:02:56.083 00:02:56.083 real 0m42.041s 00:02:56.083 user 6m2.529s 00:02:56.083 sys 2m41.040s 00:02:56.083 21:25:34 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:56.083 21:25:34 -- common/autotest_common.sh@10 -- $ set +x 00:02:56.083 ************************************ 00:02:56.083 END TEST make 00:02:56.083 ************************************ 00:02:56.345 21:25:34 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:56.345 21:25:34 -- nvmf/common.sh@7 -- # uname -s 00:02:56.345 21:25:34 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:56.345 21:25:34 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:56.345 21:25:34 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:56.345 21:25:34 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:56.345 21:25:34 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:56.345 21:25:34 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:56.345 21:25:34 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:56.345 21:25:34 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:56.345 21:25:34 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:56.345 21:25:34 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:56.345 21:25:34 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:02:56.345 21:25:34 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:02:56.345 21:25:34 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:56.345 21:25:34 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:56.345 21:25:34 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:56.345 21:25:34 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:56.345 21:25:34 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:56.345 21:25:34 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:56.345 21:25:34 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:56.345 21:25:34 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:56.345 21:25:34 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:56.345 21:25:34 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:56.345 21:25:34 -- paths/export.sh@5 -- 
# export PATH 00:02:56.345 21:25:34 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:56.345 21:25:34 -- nvmf/common.sh@46 -- # : 0 00:02:56.345 21:25:34 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:02:56.345 21:25:34 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:02:56.345 21:25:34 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:02:56.345 21:25:34 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:56.345 21:25:34 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:56.345 21:25:34 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:02:56.345 21:25:34 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:02:56.345 21:25:34 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:02:56.345 21:25:34 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:56.345 21:25:34 -- spdk/autotest.sh@32 -- # uname -s 00:02:56.345 21:25:35 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:56.345 21:25:35 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:56.345 21:25:35 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:56.345 21:25:35 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:56.345 21:25:35 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:56.345 21:25:35 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:56.345 21:25:35 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:56.345 21:25:35 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:56.345 21:25:35 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:56.345 21:25:35 -- spdk/autotest.sh@48 -- # udevadm_pid=3507885 00:02:56.345 21:25:35 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:56.345 21:25:35 -- spdk/autotest.sh@54 -- # echo 3507887 00:02:56.345 21:25:35 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:56.345 21:25:35 -- spdk/autotest.sh@56 -- # echo 3507888 00:02:56.345 21:25:35 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:02:56.345 21:25:35 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:56.345 21:25:35 -- spdk/autotest.sh@60 -- # echo 3507889 00:02:56.345 21:25:35 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:56.345 21:25:35 -- spdk/autotest.sh@62 -- # echo 3507890 00:02:56.345 21:25:35 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:56.345 21:25:35 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:56.345 21:25:35 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:02:56.345 21:25:35 -- common/autotest_common.sh@712 -- # xtrace_disable 00:02:56.345 21:25:35 -- common/autotest_common.sh@10 -- # set +x 00:02:56.345 21:25:35 -- spdk/autotest.sh@70 -- # create_test_list 00:02:56.345 21:25:35 -- common/autotest_common.sh@736 -- # xtrace_disable 00:02:56.345 21:25:35 -- common/autotest_common.sh@10 -- # set +x 00:02:56.345 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:02:56.345 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:02:56.345 21:25:35 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:02:56.345 21:25:35 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:56.345 21:25:35 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:56.345 21:25:35 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:56.345 21:25:35 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:56.345 21:25:35 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:02:56.345 21:25:35 -- common/autotest_common.sh@1440 -- # uname 00:02:56.345 21:25:35 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:02:56.345 21:25:35 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:02:56.345 21:25:35 -- common/autotest_common.sh@1460 -- # uname 00:02:56.345 21:25:35 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:02:56.345 21:25:35 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:02:56.345 21:25:35 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:02:56.345 21:25:35 -- spdk/autotest.sh@83 -- # hash lcov 00:02:56.345 21:25:35 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:02:56.345 21:25:35 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:02:56.345 21:25:35 -- common/autotest_common.sh@712 -- # xtrace_disable 00:02:56.345 21:25:35 -- common/autotest_common.sh@10 -- # set +x 00:02:56.606 21:25:35 -- spdk/autotest.sh@102 -- # rm -f 00:02:56.606 21:25:35 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:00.021 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:00:04.3 (8086 
2021): Already using the ioatdma driver 00:03:00.021 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:00.021 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:00.021 21:25:38 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:03:00.021 21:25:38 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:00.021 21:25:38 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:00.021 21:25:38 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:00.021 21:25:38 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:00.021 21:25:38 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:00.021 21:25:38 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:00.021 21:25:38 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:00.021 21:25:38 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:00.021 21:25:38 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:03:00.021 21:25:38 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:03:00.021 21:25:38 -- spdk/autotest.sh@121 -- # grep -v p 00:03:00.021 21:25:38 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:00.021 21:25:38 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:03:00.021 21:25:38 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:03:00.021 21:25:38 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:00.021 21:25:38 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:00.021 No valid GPT data, bailing 00:03:00.021 21:25:38 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:00.021 21:25:38 -- scripts/common.sh@393 -- # pt= 00:03:00.021 21:25:38 -- scripts/common.sh@394 -- # return 1 00:03:00.021 21:25:38 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:00.021 1+0 records in 00:03:00.021 1+0 records out 00:03:00.021 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00158221 s, 663 MB/s 00:03:00.021 21:25:38 -- spdk/autotest.sh@129 -- # sync 00:03:00.021 21:25:38 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:00.021 21:25:38 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:00.021 21:25:38 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:06.596 21:25:44 -- spdk/autotest.sh@135 -- # uname -s 00:03:06.596 21:25:44 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:03:06.596 21:25:44 -- spdk/autotest.sh@136 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:06.596 21:25:44 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:06.596 21:25:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:06.596 21:25:44 -- 
00:03:06.596 21:25:44 -- common/autotest_common.sh@10 -- # set +x
00:03:06.596 ************************************
00:03:06.596 START TEST setup.sh
00:03:06.596 ************************************
00:03:06.596 21:25:44 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh
00:03:06.596 * Looking for test storage...
00:03:06.596 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:03:06.596 21:25:45 -- setup/test-setup.sh@10 -- # uname -s
00:03:06.596 21:25:45 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]]
00:03:06.596 21:25:45 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh
00:03:06.596 21:25:45 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:06.596 21:25:45 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:06.596 21:25:45 -- common/autotest_common.sh@10 -- # set +x
00:03:06.596 ************************************
00:03:06.596 START TEST acl
00:03:06.596 ************************************
00:03:06.596 21:25:45 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh
00:03:06.596 * Looking for test storage...
00:03:06.596 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:03:06.596 21:25:45 -- setup/acl.sh@10 -- # get_zoned_devs
00:03:06.596 21:25:45 -- common/autotest_common.sh@1654 -- # zoned_devs=()
00:03:06.596 21:25:45 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs
00:03:06.596 21:25:45 -- common/autotest_common.sh@1655 -- # local nvme bdf
00:03:06.596 21:25:45 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme*
00:03:06.596 21:25:45 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1
00:03:06.596 21:25:45 -- common/autotest_common.sh@1647 -- # local device=nvme0n1
00:03:06.596 21:25:45 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:03:06.596 21:25:45 -- common/autotest_common.sh@1650 -- # [[ none != none ]]
00:03:06.596 21:25:45 -- setup/acl.sh@12 -- # devs=()
00:03:06.596 21:25:45 -- setup/acl.sh@12 -- # declare -a devs
00:03:06.596 21:25:45 -- setup/acl.sh@13 -- # drivers=()
00:03:06.596 21:25:45 -- setup/acl.sh@13 -- # declare -A drivers
00:03:06.596 21:25:45 -- setup/acl.sh@51 -- # setup reset
00:03:06.596 21:25:45 -- setup/common.sh@9 -- # [[ reset == output ]]
00:03:06.596 21:25:45 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:03:09.879 21:25:48 -- setup/acl.sh@52 -- # collect_setup_devs
00:03:09.879 21:25:48 -- setup/acl.sh@16 -- # local dev driver
00:03:09.879 21:25:48 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:03:09.879 21:25:48 -- setup/acl.sh@15 -- # setup output status
00:03:09.879 21:25:48 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:09.879 21:25:48 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:03:13.166 Hugepages
00:03:13.166 node hugesize free / total
00:03:13.166 21:25:51 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]]
00:03:13.166 21:25:51 -- setup/acl.sh@19 -- # continue
00:03:13.166 21:25:51 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:03:13.166 21:25:51 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]]
00:03:13.166 21:25:51 -- setup/acl.sh@19 -- # continue
00:03:13.166 21:25:51 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
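collect_setup_devs, whose trace starts here, is a read loop over the `setup.sh status` table: read -r _ dev _ _ _ driver _ takes the BDF from column two and the driver from column six of each row, the *:*:*.* glob discards the Hugepages header lines, and only NVMe controllers are kept. In sketch form (the process substitution and $rootdir are illustrative, and the zoned-device exclusion seen at acl.sh@21 is omitted; the field pattern and filters mirror the trace):

    declare -a devs
    declare -A drivers
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue   # skip "Hugepages", sizes, table headers
        [[ $driver == nvme ]] || continue   # the acl test only collects NVMe devices
        devs+=("$dev")
        drivers["$dev"]=$driver
    done < <("$rootdir/scripts/setup.sh" status)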
00:03:13.166 21:25:51 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]]
00:03:13.166 21:25:51 -- setup/acl.sh@19 -- # continue
00:03:13.166 21:25:51 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:03:13.166
00:03:13.166 Type BDF Vendor Device NUMA Driver Device Block devices
00:03:13.166 21:25:51 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]]
00:03:13.166 21:25:51 -- setup/acl.sh@19 -- # continue
00:03:13.166 21:25:51 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:03:13.166 21:25:51 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]]
00:03:13.166 21:25:51 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:03:13.166 21:25:51 -- setup/acl.sh@20 -- # continue
00:03:13.166 21:25:51 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
[trace condensed: acl.sh@18-20 repeat the same steps (BDF match at @19, '[[ ioatdma == nvme ]]' at @20, continue, read) for each remaining ioatdma controller, 0000:00:04.1 through 0000:00:04.7 and 0000:80:04.0 through 0000:80:04.7; none is kept]
00:03:13.167 21:25:51 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]]
00:03:13.167 21:25:51 -- setup/acl.sh@20 -- # [[ nvme == nvme ]]
00:03:13.167 21:25:51 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]]
00:03:13.167 21:25:51 -- setup/acl.sh@22 -- # devs+=("$dev")
00:03:13.167 21:25:51 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme
00:03:13.167 21:25:51 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:03:13.167 21:25:51 -- setup/acl.sh@24 -- # (( 1 > 0 ))
00:03:13.167 21:25:51 -- setup/acl.sh@54 -- # run_test denied denied
00:03:13.167 21:25:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:13.167 21:25:51 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:13.167 21:25:51 -- common/autotest_common.sh@10 -- # set +x
00:03:13.167 ************************************
00:03:13.167 START TEST denied
00:03:13.167 ************************************
00:03:13.167 21:25:51 -- common/autotest_common.sh@1104 -- # denied
00:03:13.167 21:25:51 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0'
00:03:13.167 21:25:51 -- setup/acl.sh@38 -- # setup output config
00:03:13.167 21:25:51 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0'
00:03:13.167 21:25:51 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:13.167 21:25:51 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:03:16.453 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0
00:03:16.453 21:25:54 -- setup/acl.sh@40 -- # verify 0000:d8:00.0
00:03:16.453 21:25:54 -- setup/acl.sh@28 -- # local dev driver
00:03:16.453 21:25:54 -- setup/acl.sh@30 -- # for dev in "$@"
00:03:16.453 21:25:54 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]]
00:03:16.453 21:25:54 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver
00:03:16.453 21:25:54 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme
00:03:16.453 21:25:54 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]]
00:03:16.453 21:25:54 -- setup/acl.sh@41 -- # setup reset
00:03:16.453 21:25:54 -- setup/common.sh@9 -- # [[ reset == output ]]
00:03:16.453 21:25:54 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:03:20.643
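The denied test turns on setup.sh honoring PCI_BLOCKED: with the NVMe controller blocked (note the leading space in the value, matching the trace), `setup.sh config` must leave the device alone and print the Skipping line that the grep asserts, after which verify re-reads the bound driver from sysfs. Condensed to its two checks (paths from the trace, control flow illustrative):

    # 1) setup.sh must refuse to touch the blocked controller
    PCI_BLOCKED=' 0000:d8:00.0' "$rootdir/scripts/setup.sh" config \
        | grep 'Skipping denied controller at 0000:d8:00.0'
    # 2) the device must still be owned by the kernel nvme driver
    driver=$(readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver)
    [[ ${driver##*/} == nvme ]]   # trace: driver=/sys/bus/pci/drivers/nvme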
00:03:20.643 real 0m7.336s
00:03:20.643 user 0m2.087s
00:03:20.643 sys 0m4.463s
00:03:20.643 21:25:58 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:20.643 21:25:58 -- common/autotest_common.sh@10 -- # set +x
00:03:20.643 ************************************
00:03:20.643 END TEST denied
00:03:20.643 ************************************
00:03:20.643 21:25:59 -- setup/acl.sh@55 -- # run_test allowed allowed
00:03:20.643 21:25:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:20.643 21:25:59 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:20.643 21:25:59 -- common/autotest_common.sh@10 -- # set +x
00:03:20.643 ************************************
00:03:20.643 START TEST allowed
00:03:20.643 ************************************
00:03:20.643 21:25:59 -- common/autotest_common.sh@1104 -- # allowed
00:03:20.643 21:25:59 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0
00:03:20.643 21:25:59 -- setup/acl.sh@45 -- # setup output config
00:03:20.643 21:25:59 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*'
00:03:20.643 21:25:59 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:20.643 21:25:59 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:03:25.913 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:03:25.913 21:26:03 -- setup/acl.sh@47 -- # verify
00:03:25.913 21:26:03 -- setup/acl.sh@28 -- # local dev driver
00:03:25.913 21:26:03 -- setup/acl.sh@48 -- # setup reset
00:03:25.913 21:26:03 -- setup/common.sh@9 -- # [[ reset == output ]]
00:03:25.913 21:26:03 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:03:29.201
00:03:29.201 real 0m8.255s
00:03:29.201 user 0m2.153s
00:03:29.201 sys 0m4.598s
00:03:29.201 21:26:07 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:29.201 21:26:07 -- common/autotest_common.sh@10 -- # set +x
00:03:29.201 ************************************
00:03:29.201 END TEST allowed
00:03:29.201 ************************************
00:03:29.201
00:03:29.201 real 0m22.270s
00:03:29.201 user 0m6.521s
00:03:29.201 sys 0m13.540s
00:03:29.201 21:26:07 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:29.201 21:26:07 -- common/autotest_common.sh@10 -- # set +x
00:03:29.201 ************************************
00:03:29.201 END TEST acl
00:03:29.201 ************************************
00:03:29.201 21:26:07 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:03:29.201 21:26:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:29.201 21:26:07 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:29.201 21:26:07 -- common/autotest_common.sh@10 -- # set +x
00:03:29.201 ************************************
00:03:29.201 START TEST hugepages
00:03:29.201 ************************************
00:03:29.201 21:26:07 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:03:29.201 * Looking for test storage...
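For readers tracking the banners: every START TEST / END TEST pair plus the real/user/sys triplet comes from the run_test helper in autotest_common.sh. Its observable behavior reduces to a named, timed invocation, roughly as below (a sketch only; the real helper also toggles xtrace and propagates failures):

    run_test() {
        local test_name=$1
        shift
        echo "************************************"
        echo "START TEST $test_name"
        echo "************************************"
        time "$@"        # emits the real/user/sys lines seen above
        echo "************************************"
        echo "END TEST $test_name"
        echo "************************************"
    }

Called as `run_test allowed allowed` or `run_test hugepages .../hugepages.sh`, exactly as the trace shows.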
00:03:29.201 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:03:29.201 21:26:07 -- setup/hugepages.sh@10 -- # nodes_sys=()
00:03:29.201 21:26:07 -- setup/hugepages.sh@10 -- # declare -a nodes_sys
00:03:29.201 21:26:07 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0
00:03:29.201 21:26:07 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:03:29.201 21:26:07 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:03:29.201 21:26:07 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:03:29.201 21:26:07 -- setup/common.sh@17 -- # local get=Hugepagesize
00:03:29.201 21:26:07 -- setup/common.sh@18 -- # local node=
00:03:29.201 21:26:07 -- setup/common.sh@19 -- # local var val
00:03:29.201 21:26:07 -- setup/common.sh@20 -- # local mem_f mem
00:03:29.201 21:26:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:29.201 21:26:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:29.201 21:26:07 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:29.201 21:26:07 -- setup/common.sh@28 -- # mapfile -t mem
00:03:29.201 21:26:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:29.201 21:26:07 -- setup/common.sh@31 -- # IFS=': '
00:03:29.201 21:26:07 -- setup/common.sh@31 -- # read -r var val _
00:03:29.201 21:26:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41197576 kB' 'MemAvailable: 44736640 kB' 'Buffers: 11500 kB' 'Cached: 10713988 kB' 'SwapCached: 0 kB' 'Active: 7812540 kB' 'Inactive: 3440680 kB' 'Active(anon): 7331484 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531040 kB' 'Mapped: 174712 kB' 'Shmem: 6803752 kB' 'KReclaimable: 228916 kB' 'Slab: 738748 kB' 'SReclaimable: 228916 kB' 'SUnreclaim: 509832 kB' 'KernelStack: 21792 kB' 'PageTables: 7988 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439068 kB' 'Committed_AS: 8674852 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213396 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:29.202 21:26:07 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:03:29.202 21:26:07 -- setup/common.sh@32 -- # continue
00:03:29.202 21:26:07 -- setup/common.sh@31 -- # IFS=': '
00:03:29.202 21:26:07 -- setup/common.sh@31 -- # read -r var val _
[trace condensed: setup/common.sh@31-32 repeat the same four steps (field test against \H\u\g\e\p\a\g\e\s\i\z\e, continue, IFS=': ', read -r var val _) for every field from MemFree through HugePages_Surp in the snapshot above]
00:03:29.203 21:26:07 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:03:29.203 21:26:07 -- setup/common.sh@33 -- # echo 2048
00:03:29.203 21:26:07 -- setup/common.sh@33 -- # return 0
00:03:29.203 21:26:07 -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:03:29.203 21:26:07 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:03:29.203 21:26:07 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
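The scan that just finished is get_meminfo, which every hugepages check below reuses: slurp /proc/meminfo (or a per-node meminfo when a node argument is given), walk it field by field, and print the value of the requested key, here Hugepagesize -> 2048. Functionally it boils down to this sketch (the real helper uses mapfile plus the "Node <n>" prefix strip visible in the trace; reading the file directly is a simplification):

    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f=/proc/meminfo
        # per-node counters live under /sys/devices/system/node/node<n>/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"      # e.g. "Hugepagesize:  2048 kB" prints 2048
                return 0
            fi
        done < "$mem_f"
        return 1
    }

With it, the assignment traced above is just default_hugepages=$(get_meminfo Hugepagesize).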
00:03:29.203 21:26:07 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC
00:03:29.203 21:26:07 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
00:03:29.203 21:26:07 -- setup/hugepages.sh@23 -- # unset -v HUGENODE
00:03:29.203 21:26:07 -- setup/hugepages.sh@24 -- # unset -v NRHUGE
00:03:29.203 21:26:07 -- setup/hugepages.sh@207 -- # get_nodes
00:03:29.203 21:26:07 -- setup/hugepages.sh@27 -- # local node
00:03:29.203 21:26:07 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:29.203 21:26:07 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048
00:03:29.203 21:26:07 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:29.203 21:26:07 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:29.203 21:26:07 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:29.203 21:26:07 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:29.203 21:26:07 -- setup/hugepages.sh@208 -- # clear_hp
00:03:29.203 21:26:07 -- setup/hugepages.sh@37 -- # local node hp
00:03:29.203 21:26:07 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:03:29.203 21:26:07 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:03:29.203 21:26:07 -- setup/hugepages.sh@41 -- # echo 0
00:03:29.203 21:26:07 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:03:29.203 21:26:07 -- setup/hugepages.sh@41 -- # echo 0
00:03:29.203 21:26:07 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:03:29.203 21:26:07 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:03:29.203 21:26:07 -- setup/hugepages.sh@41 -- # echo 0
00:03:29.203 21:26:07 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:03:29.203 21:26:07 -- setup/hugepages.sh@41 -- # echo 0
00:03:29.203 21:26:07 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:03:29.203 21:26:07 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
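clear_hp, traced above, resets every per-node hugepage pool before the test dials in its own allocation; the four echo 0 writes are two NUMA nodes times two page sizes (2048kB and 1048576kB). As a sketch (the sysfs writes require root):

    for node in /sys/devices/system/node/node[0-9]*; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"   # drop this node's pool for this page size
        done
    done
    export CLEAR_HUGE=yes   # exported in the trace, presumably read by later setup.sh runs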
00:03:29.203 21:26:07 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup
00:03:29.203 21:26:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:29.203 21:26:07 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:29.203 21:26:07 -- common/autotest_common.sh@10 -- # set +x
00:03:29.203 ************************************
00:03:29.203 START TEST default_setup
00:03:29.203 ************************************
00:03:29.203 21:26:07 -- common/autotest_common.sh@1104 -- # default_setup
00:03:29.203 21:26:07 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0
00:03:29.203 21:26:07 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:29.203 21:26:07 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:29.203 21:26:07 -- setup/hugepages.sh@51 -- # shift
00:03:29.203 21:26:07 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:29.203 21:26:07 -- setup/hugepages.sh@52 -- # local node_ids
00:03:29.203 21:26:07 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:29.203 21:26:07 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:29.203 21:26:07 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:29.203 21:26:07 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:29.203 21:26:07 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:29.203 21:26:07 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:29.203 21:26:07 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:29.203 21:26:07 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:29.203 21:26:07 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:29.203 21:26:07 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:03:29.203 21:26:07 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:29.203 21:26:07 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:29.203 21:26:07 -- setup/hugepages.sh@73 -- # return 0
00:03:29.203 21:26:07 -- setup/hugepages.sh@137 -- # setup output
00:03:29.203 21:26:07 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:29.203 21:26:07 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:32.489 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:03:32.489 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:03:33.871 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:03:33.871 21:26:12 -- setup/hugepages.sh@138 -- # verify_nr_hugepages
00:03:33.871 21:26:12 -- setup/hugepages.sh@89 -- # local node
00:03:33.871 21:26:12 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:33.871 21:26:12 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:33.871 21:26:12 -- setup/hugepages.sh@92 -- # local surp
00:03:33.871 21:26:12 -- setup/hugepages.sh@93 -- # local resv
00:03:33.871 21:26:12 -- setup/hugepages.sh@94 -- # local anon
00:03:33.871 21:26:12 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
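The test above samples the kernel's transparent-hugepage mode: /sys/kernel/mm/transparent_hugepage/enabled prints all modes with the active one bracketed, so the string only contains "[never]" when THP is fully off. On this host the mode is madvise, the inequality holds, and verify_nr_hugepages therefore goes on to sample AnonHugePages. Roughly:

    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"
    if [[ $thp != *"[never]"* ]]; then
        # THP can hand out anonymous huge pages, so they must be accounted for
        anon=$(get_meminfo AnonHugePages)                # 0 kB in this run
    fi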
00:03:33.871 21:26:12 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:33.871 21:26:12 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:33.871 21:26:12 -- setup/common.sh@18 -- # local node=
00:03:33.871 21:26:12 -- setup/common.sh@19 -- # local var val
00:03:33.871 21:26:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.871 21:26:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.871 21:26:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.871 21:26:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.871 21:26:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.871 21:26:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.871 21:26:12 -- setup/common.sh@31 -- # IFS=': '
00:03:33.871 21:26:12 -- setup/common.sh@31 -- # read -r var val _
00:03:33.871 21:26:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43399684 kB' 'MemAvailable: 46938652 kB' 'Buffers: 11500 kB' 'Cached: 10714108 kB' 'SwapCached: 0 kB' 'Active: 7830852 kB' 'Inactive: 3440680 kB' 'Active(anon): 7349796 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 548904 kB' 'Mapped: 175060 kB' 'Shmem: 6803872 kB' 'KReclaimable: 228724 kB' 'Slab: 737108 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508384 kB' 'KernelStack: 22096 kB' 'PageTables: 8636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8695072 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213380 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:33.871 21:26:12 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:33.871 21:26:12 -- setup/common.sh@32 -- # continue
00:03:33.871 21:26:12 -- setup/common.sh@31 -- # IFS=': '
00:03:33.871 21:26:12 -- setup/common.sh@31 -- # read -r var val _
[trace condensed: the same four-step scan repeats for every field from MemFree through HardwareCorrupted until AnonHugePages matches]
00:03:33.873 21:26:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:33.873 21:26:12 -- setup/common.sh@33 -- # echo 0
00:03:33.873 21:26:12 -- setup/common.sh@33 -- # return 0
00:03:33.873 21:26:12 -- setup/hugepages.sh@97 -- # anon=0
00:03:33.873 21:26:12 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:33.873 21:26:12 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:33.873 21:26:12 -- setup/common.sh@18 -- # local node=
00:03:33.873 21:26:12 -- setup/common.sh@19 -- # local var val
00:03:33.873 21:26:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.873 21:26:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.873 21:26:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.873 21:26:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.873 21:26:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.873 21:26:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.873 21:26:12 -- setup/common.sh@31 -- # IFS=': '
00:03:33.873 21:26:12 -- setup/common.sh@31 -- # read -r var val _
00:03:33.873 21:26:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43403284 kB' 'MemAvailable: 46942252 kB' 'Buffers: 11500 kB' 'Cached: 10714108 kB' 'SwapCached: 0 kB' 'Active: 7830428 kB' 'Inactive: 3440680 kB' 'Active(anon): 7349372 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 548972 kB' 'Mapped: 174940 kB' 'Shmem: 6803872 kB' 'KReclaimable: 228724 kB' 'Slab: 737088 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508364 kB' 'KernelStack: 22064 kB' 'PageTables: 8336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8696476 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213476 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
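The snapshot printed above already carries the numbers the test is converging on: HugePages_Total: 1024 and HugePages_Free: 1024 match the 1024 pages default_setup requested, and Hugetlb: 2097152 kB is exactly 1024 x 2048 kB. verify_nr_hugepages appears to gather its counters the same way; a speculative sketch of the bookkeeping, reusing the get_meminfo sketch above (only the anon and surp fetches are visible in this part of the trace):

    anon=$(get_meminfo AnonHugePages)    # 0, assigned just above
    surp=$(get_meminfo HugePages_Surp)   # the scan below is this lookup
    total=$(get_meminfo HugePages_Total) # 1024 per the snapshot
    # consistency check against the snapshot: 1024 pages * 2048 kB each
    echo $(( total * 2048 ))             # prints 2097152, the Hugetlb figure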
00:03:33.873 21:26:12 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:33.873 21:26:12 -- setup/common.sh@32 -- # continue
00:03:33.873 21:26:12 -- setup/common.sh@31 -- # IFS=': '
00:03:33.873 21:26:12 -- setup/common.sh@31 -- # read -r var val _
[trace condensed: setup/common.sh@31-32 repeat the same four steps (field test against \H\u\g\e\P\a\g\e\s\_\S\u\r\p, continue, IFS=': ', read -r var val _) for every field from MemFree through VmallocChunk as get_meminfo searches for HugePages_Surp]
00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # 
continue 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.874 21:26:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.874 21:26:12 -- setup/common.sh@33 -- # echo 0 00:03:33.874 21:26:12 -- setup/common.sh@33 -- # return 0 00:03:33.874 21:26:12 -- setup/hugepages.sh@99 -- # surp=0 00:03:33.874 21:26:12 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:33.874 21:26:12 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:33.874 21:26:12 -- setup/common.sh@18 -- # local node= 00:03:33.874 21:26:12 -- setup/common.sh@19 -- # local var val 00:03:33.874 21:26:12 -- setup/common.sh@20 -- # local mem_f mem 00:03:33.874 21:26:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.874 21:26:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:33.874 21:26:12 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:33.874 21:26:12 -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.874 21:26:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.875 21:26:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43401440 kB' 'MemAvailable: 46940408 kB' 'Buffers: 11500 kB' 'Cached: 10714120 kB' 'SwapCached: 0 kB' 'Active: 7830712 kB' 'Inactive: 3440680 kB' 'Active(anon): 7349656 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 549236 kB' 'Mapped: 174940 kB' 'Shmem: 6803884 kB' 'KReclaimable: 228724 kB' 'Slab: 737080 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508356 kB' 'KernelStack: 22112 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8696492 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213492 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:33.875 21:26:12 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.875 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.875 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.875 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.875 21:26:12 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.875 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.875 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.875 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.875 21:26:12 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.875 21:26:12 -- setup/common.sh@32 -- # continue 00:03:33.875 21:26:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.875 21:26:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.875 21:26:12 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.875 21:26:12 -- 
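The HugePages_Surp lookup above resolves to 0. The common.sh line numbers in the xtrace (@17 through @33) imply a reader of roughly the following shape; this is a sketch reconstructed from the trace, not the verbatim SPDK helper, and the extglob and here-string details are assumptions:

    #!/usr/bin/env bash
    # Reconstruction of the get_meminfo flow implied by the setup/common.sh
    # xtrace above. A sketch, not the verbatim SPDK source.
    shopt -s extglob   # needed for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1         # field to report, e.g. HugePages_Surp
        local node=${2:-}    # optional NUMA node number
        local var val _
        local mem_f=/proc/meminfo mem

        # With a node argument, read the per-node file under /sys instead
        # (mirrors the [[ -e ... ]] test at common.sh@23).
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        mapfile -t mem < "$mem_f"
        # Per-node files prefix each line with "Node <n> "; strip that prefix.
        mem=("${mem[@]#Node +([0-9]) }")

        local line
        for line in "${mem[@]}"; do
            # Split "Key:   value kB" into key and value, as at common.sh@31.
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$get" ]]; then
                echo "$val"   # kB for sized fields, a bare count for HugePages_*
                return 0
            fi
        done
        return 1
    }

    get_meminfo HugePages_Surp   # prints 0 on this host, per the trace

Called this way it scans /proc/meminfo top to bottom, which is why every key from MemTotal onward appears in the trace before the match.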
00:03:33.874 21:26:12 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:33.874 21:26:12 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:33.874 21:26:12 -- setup/common.sh@18 -- # local node=
00:03:33.874 21:26:12 -- setup/common.sh@19 -- # local var val
00:03:33.874 21:26:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.874 21:26:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.874 21:26:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.874 21:26:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.874 21:26:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.874 21:26:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.874 21:26:12 -- setup/common.sh@31 -- # IFS=': '
00:03:33.874 21:26:12 -- setup/common.sh@31 -- # read -r var val _
00:03:33.875 21:26:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43401440 kB' 'MemAvailable: 46940408 kB' 'Buffers: 11500 kB' 'Cached: 10714120 kB' 'SwapCached: 0 kB' 'Active: 7830712 kB' 'Inactive: 3440680 kB' 'Active(anon): 7349656 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 549236 kB' 'Mapped: 174940 kB' 'Shmem: 6803884 kB' 'KReclaimable: 228724 kB' 'Slab: 737080 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508356 kB' 'KernelStack: 22112 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8696492 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213492 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... xtrace elided: the key scan repeats for every /proc/meminfo key until HugePages_Rsvd matches ...]
00:03:33.876 21:26:12 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:33.876 21:26:12 -- setup/common.sh@33 -- # echo 0
00:03:33.876 21:26:12 -- setup/common.sh@33 -- # return 0
00:03:33.876 21:26:12 -- setup/hugepages.sh@100 -- # resv=0
00:03:33.876 21:26:12 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:33.876 nr_hugepages=1024
00:03:33.876 21:26:12 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:33.876 resv_hugepages=0
00:03:33.876 21:26:12 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:33.876 surplus_hugepages=0
00:03:33.876 21:26:12 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:33.876 anon_hugepages=0
00:03:33.876 21:26:12 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:33.876 21:26:12 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
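The two (( ... )) guards above are the entire consistency check for the default_setup case: the kernel's HugePages_Total has to equal the requested page count plus surplus plus reserved pages, and with this run's values that is 1024 == 1024 + 0 + 0. The same check stated standalone (variable names follow the trace; awk stands in for the get_meminfo calls):

    # Same consistency check as hugepages.sh@107/@109, in isolation.
    # Values are the ones this run reported above.
    nr_hugepages=1024   # requested pages
    surp=0              # HugePages_Surp from /proc/meminfo
    resv=0              # HugePages_Rsvd from /proc/meminfo

    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    (( total == nr_hugepages + surp + resv )) || echo "hugepage accounting off: $total" >&2
    (( total == nr_hugepages ))               || echo "unexpected total: $total" >&2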
00:03:33.876 21:26:12 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:33.876 21:26:12 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:33.876 21:26:12 -- setup/common.sh@18 -- # local node=
00:03:33.876 21:26:12 -- setup/common.sh@19 -- # local var val
00:03:33.876 21:26:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.876 21:26:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.876 21:26:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.876 21:26:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.876 21:26:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.876 21:26:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.876 21:26:12 -- setup/common.sh@31 -- # IFS=': '
00:03:33.876 21:26:12 -- setup/common.sh@31 -- # read -r var val _
00:03:33.877 21:26:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43400844 kB' 'MemAvailable: 46939812 kB' 'Buffers: 11500 kB' 'Cached: 10714136 kB' 'SwapCached: 0 kB' 'Active: 7830232 kB' 'Inactive: 3440680 kB' 'Active(anon): 7349176 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 548688 kB' 'Mapped: 174940 kB' 'Shmem: 6803900 kB' 'KReclaimable: 228724 kB' 'Slab: 737084 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508360 kB' 'KernelStack: 22048 kB' 'PageTables: 8180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8696504 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213492 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... xtrace elided: the key scan repeats until HugePages_Total matches ...]
00:03:33.878 21:26:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:33.878 21:26:12 -- setup/common.sh@33 -- # echo 1024
00:03:33.878 21:26:12 -- setup/common.sh@33 -- # return 0
00:03:33.878 21:26:12 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:33.878 21:26:12 -- setup/hugepages.sh@112 -- # get_nodes
00:03:33.878 21:26:12 -- setup/hugepages.sh@27 -- # local node
00:03:33.878 21:26:12 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:33.878 21:26:12 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:33.879 21:26:12 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:33.879 21:26:12 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:33.879 21:26:12 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:33.879 21:26:12 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:33.879 21:26:12 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:33.879 21:26:12 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:33.879 21:26:12 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:33.879 21:26:12 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:33.879 21:26:12 -- setup/common.sh@18 -- # local node=0
00:03:33.879 21:26:12 -- setup/common.sh@19 -- # local var val
00:03:33.879 21:26:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.879 21:26:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.879 21:26:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:33.879 21:26:12 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:33.879 21:26:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.879 21:26:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.879 21:26:12 -- setup/common.sh@31 -- # IFS=': '
00:03:33.879 21:26:12 -- setup/common.sh@31 -- # read -r var val _
00:03:33.879 21:26:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26337016 kB' 'MemUsed: 6255068 kB' 'SwapCached: 0 kB' 'Active: 2422892 kB' 'Inactive: 153644 kB' 'Active(anon): 2208652 kB' 'Inactive(anon): 0 kB' 'Active(file): 214240 kB' 'Inactive(file): 153644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2336908 kB' 'Mapped: 95604 kB' 'AnonPages: 242792 kB' 'Shmem: 1969024 kB' 'KernelStack: 13048 kB' 'PageTables: 4872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 114256 kB' 'Slab: 346504 kB' 'SReclaimable: 114256 kB' 'SUnreclaim: 232248 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... xtrace elided: the key scan over the node0 fields repeats until HugePages_Surp matches ...]
00:03:33.880 21:26:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:33.880 21:26:12 -- setup/common.sh@33 -- # echo 0
00:03:33.880 21:26:12 -- setup/common.sh@33 -- # return 0
00:03:33.880 21:26:12 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
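The lookup that just returned is the per-node variant: because get_meminfo was called with node 0, common.sh@23-24 switched mem_f from /proc/meminfo to /sys/devices/system/node/node0/meminfo, and the expansion at @29 stripped the "Node 0 " prefix that every line of that file carries, which is why the dump above reads like plain meminfo keys. That stripping step in isolation (a sketch assuming a NUMA host such as this one):

    # The "Node <n> " prefix stripping from setup/common.sh@29, in isolation.
    shopt -s extglob
    mapfile -t mem < /sys/devices/system/node/node0/meminfo
    # raw line:   "Node 0 MemTotal:       32592084 kB"
    mem=("${mem[@]#Node +([0-9]) }")
    # after:      "MemTotal:       32592084 kB"
    printf '%s\n' "${mem[@]:0:3}"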
00:03:33.880 21:26:12 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:33.880 21:26:12 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:33.880 21:26:12 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:33.880 21:26:12 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:33.880 node0=1024 expecting 1024
00:03:33.880 21:26:12 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:33.880 
00:03:33.880 real 0m4.988s
00:03:33.880 user 0m1.244s
00:03:33.880 sys 0m2.128s
00:03:33.880 21:26:12 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:33.880 21:26:12 -- common/autotest_common.sh@10 -- # set +x
00:03:33.880 ************************************
00:03:33.880 END TEST default_setup
00:03:33.880 ************************************
00:03:33.880 21:26:12 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:33.880 21:26:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:33.880 21:26:12 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:33.880 21:26:12 -- common/autotest_common.sh@10 -- # set +x
00:03:33.880 ************************************
00:03:33.880 START TEST per_node_1G_alloc
00:03:33.880 ************************************
00:03:33.880 21:26:12 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc
00:03:33.880 21:26:12 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:33.880 21:26:12 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:33.880 21:26:12 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:33.880 21:26:12 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:33.880 21:26:12 -- setup/hugepages.sh@51 -- # shift
00:03:33.880 21:26:12 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:33.880 21:26:12 -- setup/hugepages.sh@52 -- # local node_ids
00:03:33.880 21:26:12 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:33.880 21:26:12 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:33.880 21:26:12 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:33.880 21:26:12 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:33.880 21:26:12 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:33.880 21:26:12 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:33.880 21:26:12 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:33.880 21:26:12 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:33.880 21:26:12 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:33.880 21:26:12 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:33.880 21:26:12 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:33.880 21:26:12 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:33.880 21:26:12 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:33.880 21:26:12 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:33.880 21:26:12 -- setup/hugepages.sh@73 -- # return 0
00:03:33.880 21:26:12 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:33.880 21:26:12 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
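get_test_nr_hugepages 1048576 0 1 sizes the test that starts here: 1048576 kB divided by the 2048 kB default hugepage size gives nr_hugepages=512, and since nodes 0 and 1 are both listed, each gets the full 512 pages (1024 system-wide, matching the nr_hugepages=1024 seen after setup below). A sketch of that sizing arithmetic, with names mirroring the trace:

    # Sizing arithmetic behind "get_test_nr_hugepages 1048576 0 1".
    # Names follow the trace; the awk lookup stands in for get_meminfo.
    size=1048576                                              # requested kB
    default_hugepages=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)  # 2048 here
    nr_hugepages=$(( size / default_hugepages ))              # 512

    user_nodes=(0 1)
    declare -a nodes_test
    for _no_nodes in "${user_nodes[@]}"; do
        nodes_test[_no_nodes]=$nr_hugepages    # each listed node gets all 512
    done
    echo "per node: ${nodes_test[*]} (total $(( nodes_test[0] + nodes_test[1] )))"

scripts/setup.sh then picks the reservation up through the NRHUGE and HUGENODE environment variables set at hugepages.sh@146.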
00:03:33.880 21:26:12 -- setup/hugepages.sh@146 -- # setup output
00:03:33.880 21:26:12 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:33.880 21:26:12 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:37.168 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:37.168 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:37.168 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:37.168 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:37.168 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:37.168 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:37.168 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:37.168 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:37.168 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:37.168 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:37.430 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:37.430 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:37.430 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:37.430 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:37.430 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:37.430 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:37.430 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:37.430 21:26:16 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:03:37.430 21:26:16 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:03:37.430 21:26:16 -- setup/hugepages.sh@89 -- # local node
00:03:37.430 21:26:16 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:37.430 21:26:16 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:37.430 21:26:16 -- setup/hugepages.sh@92 -- # local surp
00:03:37.430 21:26:16 -- setup/hugepages.sh@93 -- # local resv
00:03:37.430 21:26:16 -- setup/hugepages.sh@94 -- # local anon
00:03:37.430 21:26:16 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:37.430 21:26:16 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:37.430 21:26:16 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:37.430 21:26:16 -- setup/common.sh@18 -- # local node=
00:03:37.430 21:26:16 -- setup/common.sh@19 -- # local var val
00:03:37.430 21:26:16 -- setup/common.sh@20 -- # local mem_f mem
00:03:37.430 21:26:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:37.430 21:26:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:37.430 21:26:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:37.430 21:26:16 -- setup/common.sh@28 -- # mapfile -t mem
00:03:37.430 21:26:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:37.430 21:26:16 -- setup/common.sh@31 -- # IFS=': '
00:03:37.430 21:26:16 -- setup/common.sh@31 -- # read -r var val _
00:03:37.430 21:26:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43444352 kB' 'MemAvailable: 46983320 kB' 'Buffers: 11500 kB' 'Cached: 10714228 kB' 'SwapCached: 0 kB' 'Active: 7829072 kB' 'Inactive: 3440680 kB' 'Active(anon): 7348016 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 546796 kB' 'Mapped: 173876 kB' 'Shmem: 6803992 kB' 'KReclaimable: 228724 kB' 'Slab: 737412 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508688 kB' 'KernelStack: 21856 kB' 'PageTables: 7912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8681916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213396 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:37.431 21:26:16 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:37.431 21:26:16 -- setup/common.sh@33 -- # echo 0
00:03:37.431 21:26:16 -- setup/common.sh@33 -- # return 0
00:03:37.431 21:26:16 -- setup/hugepages.sh@97 -- # anon=0
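That closes the first get_meminfo pass: each /proc/meminfo key is compared against the requested one until AnonHugePages matches, its value (0) is echoed, and the caller stores it as anon. A simplified stand-alone sketch of that scan, written from the trace rather than copied from setup/common.sh:

    #!/usr/bin/env bash
    # Sketch: scan /proc/meminfo for one key, as the traced loop does.
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do    # "AnonHugePages: 0 kB" -> var, val, unit
            [[ $var == "$get" ]] || continue    # non-matching keys are skipped
            echo "$val"                         # numeric value; the unit lands in _
            return 0
        done < /proc/meminfo
        return 1                                # key absent
    }
    get_meminfo_sketch AnonHugePages            # prints 0 on this run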
00:03:37.431 21:26:16 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:37.431 21:26:16 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:37.431 21:26:16 -- setup/common.sh@18 -- # local node=
00:03:37.431 21:26:16 -- setup/common.sh@19 -- # local var val
00:03:37.431 21:26:16 -- setup/common.sh@20 -- # local mem_f mem
00:03:37.431 21:26:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:37.431 21:26:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:37.431 21:26:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:37.431 21:26:16 -- setup/common.sh@28 -- # mapfile -t mem
00:03:37.431 21:26:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:37.431 21:26:16 -- setup/common.sh@31 -- # IFS=': '
00:03:37.431 21:26:16 -- setup/common.sh@31 -- # read -r var val _
00:03:37.431 21:26:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43447484 kB' 'MemAvailable: 46986452 kB' 'Buffers: 11500 kB' 'Cached: 10714232 kB' 'SwapCached: 0 kB' 'Active: 7828284 kB' 'Inactive: 3440680 kB' 'Active(anon): 7347228 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 546468 kB' 'Mapped: 173756 kB' 'Shmem: 6803996 kB' 'KReclaimable: 228724 kB' 'Slab: 737436 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508712 kB' 'KernelStack: 21824 kB' 'PageTables: 7832 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8682872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213380 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:37.432 21:26:16 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:37.432 21:26:16 -- setup/common.sh@33 -- # echo 0
00:03:37.432 21:26:16 -- setup/common.sh@33 -- # return 0
00:03:37.432 21:26:16 -- setup/hugepages.sh@99 -- # surp=0
00:03:37.432 21:26:16 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:37.432 21:26:16 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:37.432 21:26:16 -- setup/common.sh@18 -- # local node=
00:03:37.432 21:26:16 -- setup/common.sh@19 -- # local var val
00:03:37.432 21:26:16 -- setup/common.sh@20 -- # local mem_f mem
00:03:37.432 21:26:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:37.432 21:26:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:37.432 21:26:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:37.432 21:26:16 -- setup/common.sh@28 -- # mapfile -t mem
00:03:37.432 21:26:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:37.432 21:26:16 -- setup/common.sh@31 -- # IFS=': '
00:03:37.432 21:26:16 -- setup/common.sh@31 -- # read -r var val _
00:03:37.432 21:26:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43444112 kB' 'MemAvailable: 46983080 kB' 'Buffers: 11500 kB' 'Cached: 10714244 kB' 'SwapCached: 0 kB' 'Active: 7832424 kB' 'Inactive: 3440680 kB' 'Active(anon): 7351368 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 550644 kB' 'Mapped: 174260 kB' 'Shmem: 6804008 kB' 'KReclaimable: 228724 kB' 'Slab: 737436 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508712 kB' 'KernelStack: 21840 kB' 'PageTables: 7880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8686864 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213396 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
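For orientation, the four global counters this verifier reads have distinct meanings: HugePages_Total is the configured pool, HugePages_Free the unclaimed part, HugePages_Rsvd pages promised to existing mappings but not yet faulted in, and HugePages_Surp overcommit pages allocated beyond the configured total. Outside the harness they can be checked with plain grep (not an SPDK helper):

    grep -E '^HugePages_(Total|Free|Rsvd|Surp):' /proc/meminfo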
00:03:37.434 21:26:16 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:37.434 21:26:16 -- setup/common.sh@33 -- # echo 0
00:03:37.434 21:26:16 -- setup/common.sh@33 -- # return 0
00:03:37.434 21:26:16 -- setup/hugepages.sh@100 -- # resv=0
00:03:37.434 21:26:16 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:37.434 nr_hugepages=1024
00:03:37.434 21:26:16 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:37.434 resv_hugepages=0
00:03:37.434 21:26:16 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:37.434 surplus_hugepages=0
00:03:37.434 21:26:16 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:37.434 anon_hugepages=0
00:03:37.434 21:26:16 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:37.434 21:26:16 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
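The two arithmetic checks just logged are the core of verify_nr_hugepages: the kernel-reported total must equal the requested page count plus surplus and reserved pages, all of which are zero on this run. A stand-alone sketch of the same accounting check, using awk for brevity rather than the script's own get_meminfo path:

    #!/usr/bin/env bash
    # Sketch: re-run the hugepage accounting check with this run's values.
    nr_hugepages=1024 surp=0 resv=0
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    if (( total == nr_hugepages + surp + resv )); then
        echo "hugepage accounting consistent: $total pages"
    else
        echo "mismatch: total=$total, expected $(( nr_hugepages + surp + resv ))" >&2
    fi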
00:03:37.434 21:26:16 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:37.434 21:26:16 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:37.434 21:26:16 -- setup/common.sh@18 -- # local node=
00:03:37.434 21:26:16 -- setup/common.sh@19 -- # local var val
00:03:37.434 21:26:16 -- setup/common.sh@20 -- # local mem_f mem
00:03:37.434 21:26:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:37.434 21:26:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:37.434 21:26:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:37.434 21:26:16 -- setup/common.sh@28 -- # mapfile -t mem
00:03:37.434 21:26:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:37.434 21:26:16 -- setup/common.sh@31 -- # IFS=': '
00:03:37.434 21:26:16 -- setup/common.sh@31 -- # read -r var val _
00:03:37.434 21:26:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43447728 kB' 'MemAvailable: 46986696 kB' 'Buffers: 11500 kB' 'Cached: 10714260 kB' 'SwapCached: 0 kB' 'Active: 7828832 kB' 'Inactive: 3440680 kB' 'Active(anon): 7347776 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 546960 kB' 'Mapped: 174260 kB' 'Shmem: 6804024 kB' 'KReclaimable: 228724 kB' 'Slab: 737436 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508712 kB' 'KernelStack: 21824 kB' 'PageTables: 7820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8683448 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213380 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
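The per-node pass that follows reads /sys/devices/system/node/node<N>/meminfo instead of /proc/meminfo. Every line in the per-node file carries a "Node <N> " prefix, which the mem=("${mem[@]#Node +([0-9]) }") step in the trace strips so one parser serves both files. A minimal sketch of that node-scoped read (bash 4+ for mapfile; extglob enables the +([0-9]) pattern):

    #!/usr/bin/env bash
    # Sketch: node-scoped meminfo read, mirroring the mem_f/mapfile trace lines.
    shopt -s extglob
    node=0
    mem_f=/sys/devices/system/node/node${node}/meminfo
    mapfile -t mem < "$mem_f"            # one array element per line
    mem=("${mem[@]#Node +([0-9]) }")     # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
    printf '%s\n' "${mem[@]}" | grep '^HugePages_'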
setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:37.696 21:26:16 -- setup/common.sh@33 -- # echo 1024 00:03:37.696 21:26:16 -- setup/common.sh@33 -- # return 0 00:03:37.696 21:26:16 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:37.696 21:26:16 -- setup/hugepages.sh@112 -- # get_nodes 00:03:37.696 21:26:16 -- setup/hugepages.sh@27 -- # local node 00:03:37.696 21:26:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:37.696 21:26:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:37.696 21:26:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:37.696 21:26:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:37.696 21:26:16 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:37.696 21:26:16 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:37.696 21:26:16 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:37.696 21:26:16 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:37.696 21:26:16 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:37.696 21:26:16 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:37.696 21:26:16 -- setup/common.sh@18 -- # local node=0 00:03:37.696 21:26:16 -- setup/common.sh@19 -- # local var val 00:03:37.696 21:26:16 -- setup/common.sh@20 -- # local mem_f mem 00:03:37.696 21:26:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:37.696 21:26:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:37.696 21:26:16 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:37.696 21:26:16 -- setup/common.sh@28 -- # mapfile -t mem 00:03:37.696 21:26:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 27416680 
kB' 'MemUsed: 5175404 kB' 'SwapCached: 0 kB' 'Active: 2427200 kB' 'Inactive: 153644 kB' 'Active(anon): 2212960 kB' 'Inactive(anon): 0 kB' 'Active(file): 214240 kB' 'Inactive(file): 153644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2336952 kB' 'Mapped: 96040 kB' 'AnonPages: 247072 kB' 'Shmem: 1969068 kB' 'KernelStack: 12824 kB' 'PageTables: 4400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 114256 kB' 'Slab: 346868 kB' 'SReclaimable: 114256 kB' 'SUnreclaim: 232612 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.696 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.696 21:26:16 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # continue 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:37.697 21:26:16 -- setup/common.sh@32 -- # [[ 
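The pattern the xtrace above keeps exercising is a single helper. Below is a simplified, hedged reconstruction of setup/common.sh's get_meminfo as it appears in this trace (not the verbatim SPDK source; argument handling is trimmed): it picks /proc/meminfo or the per-node sysfs file, strips the "Node N " prefix, then scans key by key until the requested field matches.

    #!/usr/bin/env bash
    # Simplified reconstruction of the get_meminfo helper seen in the trace above.
    shopt -s extglob   # the +([0-9]) pattern below requires extglob

    get_meminfo() {
        local get=$1 node=$2
        local var val _ line
        local mem_f=/proc/meminfo
        # With a node index, read the per-node stats from sysfs instead.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node N " prefix on sysfs lines
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # skip keys until the requested one
            echo "$val"                        # e.g. 1024 for HugePages_Total above
            return 0
        done
        return 1
    }

    get_meminfo HugePages_Surp 0   # prints 0 for node0 in this run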
00:03:37.697 21:26:16 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:37.697 21:26:16 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:37.697 21:26:16 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:37.697 21:26:16 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:37.697 21:26:16 -- setup/common.sh@18 -- # local node=1
00:03:37.697 21:26:16 -- setup/common.sh@19 -- # local var val
00:03:37.697 21:26:16 -- setup/common.sh@20 -- # local mem_f mem
00:03:37.697 21:26:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:37.697 21:26:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:37.697 21:26:16 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:37.697 21:26:16 -- setup/common.sh@28 -- # mapfile -t mem
00:03:37.697 21:26:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:37.697 21:26:16 -- setup/common.sh@31 -- # IFS=': '
00:03:37.697 21:26:16 -- setup/common.sh@31 -- # read -r var val _
00:03:37.697 21:26:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16032252 kB' 'MemUsed: 11670896 kB' 'SwapCached: 0 kB' 'Active: 5407072 kB' 'Inactive: 3287036 kB' 'Active(anon): 5140256 kB' 'Inactive(anon): 0 kB' 'Active(file): 266816 kB' 'Inactive(file): 3287036 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8388836 kB' 'Mapped: 78592 kB' 'AnonPages: 305396 kB' 'Shmem: 4834984 kB' 'KernelStack: 9032 kB' 'PageTables: 3584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 114468 kB' 'Slab: 390568 kB' 'SReclaimable: 114468 kB' 'SUnreclaim: 276100 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... setup/common.sh@31-32 scans each node1 meminfo key (MemTotal through HugePages_Free, as listed in the snapshot above) and continues until HugePages_Surp matches ...]
00:03:37.698 21:26:16 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:37.698 21:26:16 -- setup/common.sh@33 -- # echo 0
00:03:37.698 21:26:16 -- setup/common.sh@33 -- # return 0
00:03:37.698 21:26:16 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:37.698 21:26:16 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:37.698 21:26:16 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:37.698 21:26:16 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:37.698 21:26:16 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:37.698 node0=512 expecting 512
00:03:37.698 21:26:16 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:37.698 21:26:16 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:37.698 21:26:16 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:37.698 21:26:16 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:37.698 node1=512 expecting 512
00:03:37.698 21:26:16 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:37.698 real	0m3.703s
00:03:37.698 user	0m1.433s
00:03:37.698 sys	0m2.340s
00:03:37.698 21:26:16 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:37.698 21:26:16 -- common/autotest_common.sh@10 -- # set +x
00:03:37.698 ************************************
00:03:37.698 END TEST per_node_1G_alloc
00:03:37.698 ************************************
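Before the next test starts, it helps to see what per_node_1G_alloc just asserted. This is a hedged sketch of the accounting, using values from the trace and reusing the get_meminfo sketch above (variable names follow the trace; this is not the verbatim verify loop):

    # Expected per-node split recorded by get_test_nr_hugepages_per_node.
    declare -a nodes_test=([0]=512 [1]=512)
    resv=0   # no reserved pages in this run

    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        # HugePages_Surp was 0 on both nodes above, so the totals stay 512.
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
        echo "node${node}=${nodes_test[node]} expecting 512"
    done

    # Global consistency check from setup/hugepages.sh@110:
    # HugePages_Total == nr_hugepages + surp + resv, i.e. 1024 == 1024 + 0 + 0.
    (( 1024 == 1024 + 0 + 0 )) && echo "hugepage pool consistent"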
00:03:37.698 21:26:16 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:03:37.698 21:26:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:37.698 21:26:16 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:37.698 21:26:16 -- common/autotest_common.sh@10 -- # set +x
00:03:37.698 ************************************
00:03:37.698 START TEST even_2G_alloc
00:03:37.698 ************************************
00:03:37.698 21:26:16 -- common/autotest_common.sh@1104 -- # even_2G_alloc
00:03:37.698 21:26:16 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:03:37.698 21:26:16 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:37.698 21:26:16 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:37.698 21:26:16 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:37.698 21:26:16 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:37.698 21:26:16 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:37.698 21:26:16 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:37.698 21:26:16 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:37.698 21:26:16 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:37.698 21:26:16 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:37.698 21:26:16 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:37.698 21:26:16 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:37.698 21:26:16 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:37.698 21:26:16 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:37.698 21:26:16 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:37.698 21:26:16 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:37.698 21:26:16 -- setup/hugepages.sh@83 -- # : 512
00:03:37.699 21:26:16 -- setup/hugepages.sh@84 -- # : 1
00:03:37.699 21:26:16 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:37.699 21:26:16 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:37.699 21:26:16 -- setup/hugepages.sh@83 -- # : 0
00:03:37.699 21:26:16 -- setup/hugepages.sh@84 -- # : 0
00:03:37.699 21:26:16 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:37.699 21:26:16 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:03:37.699 21:26:16 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:03:37.699 21:26:16 -- setup/hugepages.sh@153 -- # setup output
00:03:37.699 21:26:16 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:37.699 21:26:16 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:40.994 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:40.994 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
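The get_test_nr_hugepages 2097152 call traced above reduces to simple arithmetic. Assuming the size argument is in kB, which is consistent with the 'Hugepagesize: 2048 kB' and nr_hugepages=1024 values elsewhere in this log, the split works out as follows:

    # Hedged reconstruction of the even_2G_alloc sizing, values from the trace.
    size_kb=2097152                                      # 2 GiB requested
    default_hugepages_kb=2048                            # Hugepagesize in /proc/meminfo
    nr_hugepages=$(( size_kb / default_hugepages_kb ))   # 1024 pages
    no_nodes=2
    # HUGE_EVEN_ALLOC=yes splits the pool evenly across the NUMA nodes.
    echo "NRHUGE=$nr_hugepages -> $(( nr_hugepages / no_nodes )) pages per node"   # 512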
00:03:40.994 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:40.994 21:26:19 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:03:40.994 21:26:19 -- setup/hugepages.sh@89 -- # local node
00:03:40.994 21:26:19 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:40.994 21:26:19 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:40.994 21:26:19 -- setup/hugepages.sh@92 -- # local surp
00:03:40.994 21:26:19 -- setup/hugepages.sh@93 -- # local resv
00:03:40.994 21:26:19 -- setup/hugepages.sh@94 -- # local anon
00:03:40.994 21:26:19 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:40.994 21:26:19 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:40.994 21:26:19 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:40.994 21:26:19 -- setup/common.sh@18 -- # local node=
00:03:40.994 21:26:19 -- setup/common.sh@19 -- # local var val
00:03:40.994 21:26:19 -- setup/common.sh@20 -- # local mem_f mem
00:03:40.994 21:26:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:40.994 21:26:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:40.994 21:26:19 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:40.994 21:26:19 -- setup/common.sh@28 -- # mapfile -t mem
00:03:40.994 21:26:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:40.994 21:26:19 -- setup/common.sh@31 -- # IFS=': '
00:03:40.994 21:26:19 -- setup/common.sh@31 -- # read -r var val _
00:03:40.994 21:26:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43470248 kB' 'MemAvailable: 47009216 kB' 'Buffers: 11500 kB' 'Cached: 10714364 kB' 'SwapCached: 0 kB' 'Active: 7829816 kB' 'Inactive: 3440680 kB' 'Active(anon): 7348760 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 547468 kB' 'Mapped: 173872 kB' 'Shmem: 6804128 kB' 'KReclaimable: 228724 kB' 'Slab: 736852 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508128 kB' 'KernelStack: 21856 kB' 'PageTables: 7932 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8682712 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213412 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... setup/common.sh@31-32 scans each /proc/meminfo key (MemTotal through HardwareCorrupted, as listed in the snapshot above) and continues until AnonHugePages matches ...]
00:03:40.995 21:26:19 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:40.995 21:26:19 -- setup/common.sh@33 -- # echo 0
00:03:40.995 21:26:19 -- setup/common.sh@33 -- # return 0
00:03:40.995 21:26:19 -- setup/hugepages.sh@97 -- # anon=0
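The [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test above is a transparent-hugepage gate: "always [madvise] never" is the usual content format of /sys/kernel/mm/transparent_hugepage/enabled (the exact file read is an assumption here; only the string appears in the trace). A sketch of the logic, reusing the get_meminfo sketch above:

    # Hedged sketch: count AnonHugePages only when THP is not pinned to [never].
    anon=0
    thp_enabled=$(< /sys/kernel/mm/transparent_hugepage/enabled)   # assumed source of the string
    if [[ $thp_enabled != *"[never]"* ]]; then
        # THP is at least partially enabled, so anonymous hugepages may exist.
        anon=$(get_meminfo AnonHugePages)   # 0 kB in this run
    fi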
21:26:19 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:40.995 21:26:19 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:40.995 21:26:19 -- setup/common.sh@18 -- # local node= 00:03:40.995 21:26:19 -- setup/common.sh@19 -- # local var val 00:03:40.995 21:26:19 -- setup/common.sh@20 -- # local mem_f mem 00:03:40.995 21:26:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:40.995 21:26:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:40.995 21:26:19 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:40.995 21:26:19 -- setup/common.sh@28 -- # mapfile -t mem 00:03:40.995 21:26:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:40.995 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.995 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.995 21:26:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43474020 kB' 'MemAvailable: 47012988 kB' 'Buffers: 11500 kB' 'Cached: 10714364 kB' 'SwapCached: 0 kB' 'Active: 7830136 kB' 'Inactive: 3440680 kB' 'Active(anon): 7349080 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 547816 kB' 'Mapped: 173872 kB' 'Shmem: 6804128 kB' 'KReclaimable: 228724 kB' 'Slab: 736844 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508120 kB' 'KernelStack: 21824 kB' 'PageTables: 7816 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8682724 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213380 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:40.995 21:26:19 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.995 21:26:19 -- setup/common.sh@32 -- # continue 00:03:40.995 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.995 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.995 21:26:19 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.995 21:26:19 -- setup/common.sh@32 -- # continue 00:03:40.995 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.995 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.995 21:26:19 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.995 21:26:19 -- setup/common.sh@32 -- # continue 00:03:40.995 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:40.995 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:40.995 21:26:19 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:40.995 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 
00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.355 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.355 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.356 21:26:19 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.356 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.356 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.356 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.356 21:26:19 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]]
00:03:41.356 21:26:19 -- setup/common.sh@32 -- # continue
[... xtrace omitted: the same compare/continue/read cycle repeats for each remaining /proc/meminfo key, Bounce through HugePages_Rsvd; none match HugePages_Surp ...]
00:03:41.356 21:26:19 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:41.356 21:26:19 -- setup/common.sh@33 -- # echo 0
00:03:41.356 21:26:19 -- setup/common.sh@33 -- # return 0
00:03:41.356 21:26:19 -- setup/hugepages.sh@99 -- # surp=0
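(The wall of compare/continue entries above is setup/common.sh's get_meminfo doing a linear scan: load the meminfo file into an array, walk it key by key, and echo the value of the first key that matches the requested name, HugePages_Surp here, which is 0. A minimal sketch of the same lookup, assuming plain bash on Linux; it streams the file with while-read rather than the script's mapfile-into-array approach, and the function name is illustrative:)

    # Sketch: look up one key in /proc/meminfo, as the xtrace above is doing.
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            # Stop at the first matching key and print its value (kB for most keys).
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
    get_meminfo_sketch HugePages_Surp   # prints 0 on this test node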
00:03:41.356 21:26:19 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:41.356 21:26:19 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:41.356 21:26:19 -- setup/common.sh@18 -- # local node=
00:03:41.356 21:26:19 -- setup/common.sh@19 -- # local var val
00:03:41.356 21:26:19 -- setup/common.sh@20 -- # local mem_f mem
00:03:41.356 21:26:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:41.356 21:26:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:41.356 21:26:19 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:41.356 21:26:19 -- setup/common.sh@28 -- # mapfile -t mem
00:03:41.356 21:26:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:41.356 21:26:19 -- setup/common.sh@31 -- # IFS=': '
00:03:41.356 21:26:19 -- setup/common.sh@31 -- # read -r var val _
00:03:41.356 21:26:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43487476 kB' 'MemAvailable: 47026444 kB' 'Buffers: 11500 kB' 'Cached: 10714380 kB' 'SwapCached: 0 kB' 'Active: 7829120 kB' 'Inactive: 3440680 kB' 'Active(anon): 7348064 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 547288 kB' 'Mapped: 173760 kB' 'Shmem: 6804144 kB' 'KReclaimable: 228724 kB' 'Slab: 736844 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508120 kB' 'KernelStack: 21856 kB' 'PageTables: 7900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8682588 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213380 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... xtrace omitted: each meminfo key, MemTotal through HugePages_Free, is compared against HugePages_Rsvd and skipped with continue ...]
00:03:41.357 21:26:19 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:41.357 21:26:19 -- setup/common.sh@33 -- # echo 0
00:03:41.357 21:26:19 -- setup/common.sh@33 -- # return 0
00:03:41.357 21:26:19 -- setup/hugepages.sh@100 -- # resv=0
00:03:41.357 21:26:19 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:41.357 nr_hugepages=1024
00:03:41.357 21:26:19 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:41.357 resv_hugepages=0
00:03:41.357 21:26:19 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:41.357 surplus_hugepages=0
00:03:41.357 21:26:19 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:41.357 anon_hugepages=0
00:03:41.357 21:26:19 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:41.358 21:26:19 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
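(The @107/@109 arithmetic entries above are the invariant this test is really checking: the page count the test requested must equal the kernel's HugePages_Total once surplus and reserved pages are folded in, both 0 in this run. A standalone sketch of that check, assuming awk is available; the variable names mirror the script's:)

    # Sketch: verify requested hugepages against the kernel's accounting.
    nr_hugepages=1024; surp=0; resv=0
    total=$(awk '/^HugePages_Total/ {print $2}' /proc/meminfo)
    if (( total == nr_hugepages + surp + resv )); then
        echo "hugepage accounting consistent ($total pages)"
    else
        echo "mismatch: kernel reports $total, expected $((nr_hugepages + surp + resv))"
    fi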
00:03:41.358 21:26:19 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:41.358 21:26:19 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:41.358 21:26:19 -- setup/common.sh@18 -- # local node=
00:03:41.358 21:26:19 -- setup/common.sh@19 -- # local var val
00:03:41.358 21:26:19 -- setup/common.sh@20 -- # local mem_f mem
00:03:41.358 21:26:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:41.358 21:26:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:41.358 21:26:19 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:41.358 21:26:19 -- setup/common.sh@28 -- # mapfile -t mem
00:03:41.358 21:26:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:41.358 21:26:19 -- setup/common.sh@31 -- # IFS=': '
00:03:41.358 21:26:19 -- setup/common.sh@31 -- # read -r var val _
00:03:41.358 21:26:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43487736 kB' 'MemAvailable: 47026704 kB' 'Buffers: 11500 kB' 'Cached: 10714396 kB' 'SwapCached: 0 kB' 'Active: 7829084 kB' 'Inactive: 3440680 kB' 'Active(anon): 7348028 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 547248 kB' 'Mapped: 173760 kB' 'Shmem: 6804160 kB' 'KReclaimable: 228724 kB' 'Slab: 736844 kB' 'SReclaimable: 228724 kB' 'SUnreclaim: 508120 kB' 'KernelStack: 21840 kB' 'PageTables: 7860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8682736 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213332 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... xtrace omitted: each meminfo key, MemTotal through Unaccepted, is compared against HugePages_Total and skipped with continue ...]
00:03:41.359 21:26:19 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:41.359 21:26:19 -- setup/common.sh@33 -- # echo 1024
00:03:41.359 21:26:19 -- setup/common.sh@33 -- # return 0
00:03:41.359 21:26:19 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:41.359 21:26:19 -- setup/hugepages.sh@112 -- # get_nodes
00:03:41.359 21:26:19 -- setup/hugepages.sh@27 -- # local node
00:03:41.359 21:26:19 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:41.359 21:26:19 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:41.359 21:26:19 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:41.359 21:26:19 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:41.359 21:26:19 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:41.359 21:26:19 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
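(get_nodes, just traced, discovers the NUMA layout by globbing /sys/devices/system/node/node+([0-9]), an extglob pattern, and recording the kernel's per-node hugepage counts, 512 on each of the two nodes here. A sketch of that enumeration, assuming a kernel that exposes per-node meminfo files; the awk field index relies on the standard "Node N Key: value" layout of those files:)

    # Sketch: count kernel-allocated huge pages per NUMA node via sysfs.
    shopt -s extglob nullglob
    nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
        # Per-node meminfo lines look like "Node 0 HugePages_Total:   512".
        nodes_sys[${node##*node}]=$(awk '/HugePages_Total/ {print $4}' "$node/meminfo")
    done
    echo "nodes=${#nodes_sys[@]} hugepages per node: ${nodes_sys[*]}"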
00:03:41.359 21:26:19 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:41.359 21:26:19 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:41.359 21:26:19 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:41.359 21:26:19 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:41.359 21:26:19 -- setup/common.sh@18 -- # local node=0
00:03:41.359 21:26:19 -- setup/common.sh@19 -- # local var val
00:03:41.359 21:26:19 -- setup/common.sh@20 -- # local mem_f mem
00:03:41.359 21:26:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:41.359 21:26:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:41.359 21:26:19 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:41.359 21:26:19 -- setup/common.sh@28 -- # mapfile -t mem
00:03:41.359 21:26:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:41.359 21:26:19 -- setup/common.sh@31 -- # IFS=': '
00:03:41.359 21:26:19 -- setup/common.sh@31 -- # read -r var val _
00:03:41.359 21:26:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 27440064 kB' 'MemUsed: 5152020 kB' 'SwapCached: 0 kB' 'Active: 2421348 kB' 'Inactive: 153644 kB' 'Active(anon): 2207108 kB' 'Inactive(anon): 0 kB' 'Active(file): 214240 kB' 'Inactive(file): 153644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2337016 kB' 'Mapped: 95316 kB' 'AnonPages: 241188 kB' 'Shmem: 1969132 kB' 'KernelStack: 12792 kB' 'PageTables: 4180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 114256 kB' 'Slab: 346216 kB' 'SReclaimable: 114256 kB' 'SUnreclaim: 231960 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace omitted: each node0 meminfo key, MemTotal through HugePages_Free, is compared against HugePages_Surp and skipped with continue ...]
00:03:41.360 21:26:19 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:41.360 21:26:19 -- setup/common.sh@33 -- # echo 0
00:03:41.360 21:26:19 -- setup/common.sh@33 -- # return 0
00:03:41.360 21:26:19 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
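(The node 0 pass above shows get_meminfo's second mode: given a node argument, mem_f switches from /proc/meminfo to that node's sysfs meminfo, and the mem=("${mem[@]#Node +([0-9]) }") step strips the "Node 0 " prefix so one parser handles both layouts. A sketch of that source selection, assuming sed for the prefix strip; the function name is illustrative:)

    # Sketch: read system-wide or per-node meminfo with one parser.
    node_meminfo() {
        local node=$1 mem_f=/proc/meminfo
        # With a node argument, use that node's sysfs file instead.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        # Drop the "Node N " prefix so both formats parse identically.
        sed 's/^Node [0-9]* //' "$mem_f"
    }
    node_meminfo 0 | grep '^HugePages_Surp'   # "HugePages_Surp: 0" in the dump above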
setup/common.sh@31 -- # read -r var val _ 00:03:41.360 21:26:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16047672 kB' 'MemUsed: 11655476 kB' 'SwapCached: 0 kB' 'Active: 5407736 kB' 'Inactive: 3287036 kB' 'Active(anon): 5140920 kB' 'Inactive(anon): 0 kB' 'Active(file): 266816 kB' 'Inactive(file): 3287036 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8388880 kB' 'Mapped: 78444 kB' 'AnonPages: 306060 kB' 'Shmem: 4835028 kB' 'KernelStack: 9048 kB' 'PageTables: 3680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 114468 kB' 'Slab: 390628 kB' 'SReclaimable: 114468 kB' 'SUnreclaim: 276160 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # continue 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.360 21:26:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.360 21:26:19 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.361 
21:26:19 -- setup/common.sh@32 -- # continue
[ xtrace elided: setup/common.sh@31-32 repeats the same IFS=': ' / read -r var val _ / continue cycle for every remaining /proc/meminfo key (Unevictable through HugePages_Free); none matches HugePages_Surp ]
00:03:41.361 21:26:19 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.361 21:26:19 -- setup/common.sh@33 -- # echo 0 00:03:41.361 21:26:19 -- setup/common.sh@33 -- # return 0
00:03:41.361 21:26:19 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:41.361 21:26:19 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:41.361 21:26:19 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:41.361 21:26:19 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:41.361 21:26:19 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:41.361 node0=512 expecting 512
00:03:41.361 21:26:19 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:41.361 21:26:19 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:41.361 21:26:19 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:41.361 21:26:19 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:41.361 node1=512 expecting 512
00:03:41.361 21:26:19 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:41.361
00:03:41.361 real 0m3.574s
00:03:41.361 user 0m1.364s
00:03:41.361 sys 0m2.279s
00:03:41.361 21:26:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:41.361 21:26:19 -- common/autotest_common.sh@10 -- # set +x
00:03:41.361 ************************************
00:03:41.361 END TEST even_2G_alloc
00:03:41.361 ************************************
00:03:41.361 21:26:19 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:41.361 21:26:19 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:41.361 21:26:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:41.361 21:26:19 -- common/autotest_common.sh@10 -- # set +x
00:03:41.361 ************************************
00:03:41.361 START TEST odd_alloc
00:03:41.361 ************************************
00:03:41.361 21:26:19 -- common/autotest_common.sh@1104 -- # odd_alloc 00:03:41.361 21:26:19 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:41.361 21:26:19 -- setup/hugepages.sh@49 -- # local size=2098176 00:03:41.361 21:26:19 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:41.361 21:26:19 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:41.361 21:26:19 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:03:41.361 21:26:19 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:41.361 21:26:19 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:41.361 21:26:19 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:41.361 21:26:19 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:41.361 21:26:19 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:41.361 21:26:19 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:41.361 21:26:19 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:41.361 21:26:19 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:41.361 21:26:19 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:41.361 21:26:19 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:41.361 21:26:19 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
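The sizing in this trace is deliberate: get_test_nr_hugepages is handed 2098176 kB of 2048 kB pages, and 2098176 / 2048 = 1024.5 rounds up to 1025, so odd_alloc requests an odd page count by construction, which an even two-node split cannot satisfy exactly. A minimal re-derivation of the count and the per-node split, written from the trace rather than copied from setup/hugepages.sh (all names below are illustrative):

  #!/usr/bin/env bash
  # Hypothetical re-derivation of the odd_alloc sizing traced above.
  hugemem_mb=2049                      # HUGEMEM, set a few lines below
  hugepagesize_kb=2048                 # 2 MB hugepages
  size_kb=$(( hugemem_mb * 1024 ))     # 2098176 kB, as passed in the trace
  nr=$(( (size_kb + hugepagesize_kb - 1) / hugepagesize_kb ))   # round up -> 1025
  no_nodes=2
  declare -a nodes_test
  while (( no_nodes > 0 )); do
      nodes_test[no_nodes - 1]=$(( nr / 2 ))    # 512 per node...
      (( no_nodes-- ))
  done
  nodes_test[0]=$(( nodes_test[0] + nr % 2 ))   # ...plus the odd page on node0
  echo "nr_hugepages=$nr node0=${nodes_test[0]} node1=${nodes_test[1]}"
  # -> nr_hugepages=1025 node0=513 node1=512

The nodes_test assignments in the surrounding trace (512 for node1 just above, then 513 for node0 just below) land on the same split.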
00:03:41.361 21:26:19 -- setup/hugepages.sh@83 -- # : 513 00:03:41.361 21:26:19 -- setup/hugepages.sh@84 -- # : 1 00:03:41.361 21:26:19 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:41.361 21:26:19 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:03:41.361 21:26:19 -- setup/hugepages.sh@83 -- # : 0 00:03:41.361 21:26:19 -- setup/hugepages.sh@84 -- # : 0 00:03:41.361 21:26:19 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:41.361 21:26:19 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:03:41.361 21:26:19 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:03:41.361 21:26:19 -- setup/hugepages.sh@160 -- # setup output 00:03:41.361 21:26:19 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:41.361 21:26:19 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:44.649 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:44.649 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:44.649 21:26:23 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:03:44.649 21:26:23 -- setup/hugepages.sh@89 -- # local node 00:03:44.649 21:26:23 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:44.649 21:26:23 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:44.649 21:26:23 -- setup/hugepages.sh@92 -- # local surp 00:03:44.649 21:26:23 -- setup/hugepages.sh@93 -- # local resv 00:03:44.649 21:26:23 -- setup/hugepages.sh@94 -- # local anon 00:03:44.649 21:26:23 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:44.649 21:26:23 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:44.649 21:26:23 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:44.649 21:26:23 -- setup/common.sh@18 -- # local node= 00:03:44.649 21:26:23 -- setup/common.sh@19 -- # local var val 00:03:44.649 21:26:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:44.649 21:26:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.649 21:26:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:44.649 21:26:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:44.649 21:26:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.649 21:26:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.649 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.649 21:26:23 -- setup/common.sh@31 -- # read -r var val _
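setup.sh then reserves the requested hugepages (HUGEMEM=2049) and checks device bindings; every PCI function above is already bound to vfio-pci, so nothing is rebound. A hedged way to spot-check such bindings from sysfs, outside the script (the loop and its device list are illustrative, not part of setup.sh):

  #!/usr/bin/env bash
  # Resolve the driver symlink for a few of the functions listed above.
  for dev in 0000:00:04.0 0000:80:04.0 0000:d8:00.0; do
      link=/sys/bus/pci/devices/$dev/driver
      if [[ -e $link ]]; then
          echo "$dev -> $(basename "$(readlink -f "$link")")"   # expect vfio-pci here
      else
          echo "$dev -> (no driver bound)"
      fi
  done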
00:03:44.649 21:26:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43508020 kB' 'MemAvailable: 47046992 kB' 'Buffers: 11500 kB' 'Cached: 10714504 kB' 'SwapCached: 0 kB' 'Active: 7830520 kB' 'Inactive: 3440680 kB' 'Active(anon): 7349464 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 548168 kB' 'Mapped: 173868 kB' 'Shmem: 6804268 kB' 'KReclaimable: 228732 kB' 'Slab: 737320 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 508588 kB' 'KernelStack: 21792 kB' 'PageTables: 7628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 8683376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213492 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[ xtrace elided: setup/common.sh@32 tests every key above against AnonHugePages and hits continue each time, MemTotal through HardwareCorrupted; timestamps advance 00:03:44.649 -> 00:03:44.912 over the scan ]
00:03:44.912 21:26:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.912 21:26:23 -- setup/common.sh@33 -- # echo 0 00:03:44.912 21:26:23 -- setup/common.sh@33 -- # return 0
00:03:44.912 21:26:23 -- setup/hugepages.sh@97 -- # anon=0
00:03:44.912 21:26:23 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:44.912 21:26:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:44.912 21:26:23 -- setup/common.sh@18 -- # local node= 00:03:44.912 21:26:23 -- setup/common.sh@19 -- # local var val 00:03:44.912 21:26:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:44.912 21:26:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.912 21:26:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:44.912 21:26:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:44.912 21:26:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.912 21:26:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.912 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.912 21:26:23 -- setup/common.sh@31 -- # read -r var val _
00:03:44.912 21:26:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43509432 kB' 'MemAvailable: 47048404 kB' 'Buffers: 11500 kB' 'Cached: 10714504 kB' 'SwapCached: 0 kB' 'Active: 7830100 kB' 'Inactive: 3440680 kB' 'Active(anon): 7349044 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 547740 kB' 'Mapped: 173844 kB' 'Shmem: 6804268 kB' 'KReclaimable: 228732 kB' 'Slab: 737256 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 508524 kB' 'KernelStack: 21840 kB' 'PageTables: 7856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 8683388 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213460 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
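All four get_meminfo calls in verify_nr_hugepages expand to the same shape in the xtrace: snapshot the file with mapfile, strip any per-node prefix, then walk the lines one at a time, splitting each on ': ' and comparing the key until the requested field matches; that linear walk is what produces the long runs of read/continue above, and the backslash-escaped patterns like \H\u\g\e\P\a\g\e\s\_\S\u\r\p are just how xtrace renders a literal string on the right-hand side of ==. A reconstruction of the loop inferred from the trace (hedged: not a verbatim copy of setup/common.sh, and the no-match return value is an assumption):

  #!/usr/bin/env bash
  shopt -s extglob                     # for the +([0-9]) pattern below
  get_meminfo() {
      local get=$1 node=${2:-}
      local var val _
      local mem_f=/proc/meminfo
      # Per-node stats live in sysfs and prefix every line with "Node N ".
      [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
          mem_f=/sys/devices/system/node/node$node/meminfo
      local -a mem
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")            # drop the prefix when present
      local line
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"  # key, value, unit
          [[ $var == "$get" ]] || continue        # the traced continue storm
          echo "$val"
          return 0
      done
      return 1                                    # assumed no-match behavior
  }
  get_meminfo HugePages_Surp   # prints 0 on this box, as echoed in the trace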
00:03:44.912 21:26:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.912 21:26:23 -- setup/common.sh@32 -- # continue
[ xtrace elided: the per-key continue cycle repeats for every snapshot key through HugePages_Rsvd; no match until HugePages_Surp, timestamps reaching 00:03:44.913 ]
00:03:44.913 21:26:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.913 21:26:23 -- setup/common.sh@33 -- # echo 0 00:03:44.913 21:26:23 -- setup/common.sh@33 -- # return 0
00:03:44.913 21:26:23 -- setup/hugepages.sh@99 -- # surp=0
00:03:44.913 21:26:23 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:44.913 21:26:23 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:44.913 21:26:23 -- setup/common.sh@18 -- # local node= 00:03:44.913 21:26:23 -- setup/common.sh@19 -- # local var val 00:03:44.913 21:26:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:44.913 21:26:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.913 21:26:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:44.913 21:26:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:44.913 21:26:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.913 21:26:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.913 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.913 21:26:23 -- setup/common.sh@31 -- # read -r var val _
00:03:44.913 21:26:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43508536 kB' 'MemAvailable: 47047508 kB' 'Buffers: 11500 kB' 'Cached: 10714516 kB' 'SwapCached: 0 kB' 'Active: 7829772 kB' 'Inactive: 3440680 kB' 'Active(anon): 7348716 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 547872 kB' 'Mapped: 173764 kB' 'Shmem: 6804280 kB' 'KReclaimable: 228732 kB' 'Slab: 737212 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 508480 kB' 'KernelStack: 21824 kB' 'PageTables: 7800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 8683400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213476 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
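Every call above passes no node, so the [[ -e /sys/devices/system/node/node/meminfo ]] probe is false and mem_f stays /proc/meminfo; when hugepages.sh later fills the per-node arrays it passes a node number, the same loop reads node$N/meminfo, and the mem=("${mem[@]#Node +([0-9]) }") expansion strips the "Node N" prefix those lines carry. For hugepage counters specifically, sysfs also exposes per-node files directly, as in this hedged illustration (standard kernel paths, not taken from this log):

  #!/usr/bin/env bash
  # Read per-node 2 MB hugepage counters straight from sysfs.
  for node_dir in /sys/devices/system/node/node[0-9]*; do
      n=${node_dir##*/node}
      hp=$node_dir/hugepages/hugepages-2048kB
      printf 'node%s: nr=%s free=%s surplus=%s\n' "$n" \
          "$(cat "$hp/nr_hugepages")" \
          "$(cat "$hp/free_hugepages")" \
          "$(cat "$hp/surplus_hugepages")"
  done
  # On this box nr= should reflect the 513/512 split requested above.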
00:03:44.913 21:26:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:44.913 21:26:23 -- setup/common.sh@32 -- # continue
[ xtrace elided: the same continue cycle runs once more, MemFree through HugePages_Free, with no match against HugePages_Rsvd; timestamps reach 00:03:44.915 ]
00:03:44.915 21:26:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:44.915 21:26:23 -- setup/common.sh@33 -- # echo 0 00:03:44.915 21:26:23 -- setup/common.sh@33 -- # return 0
00:03:44.915 21:26:23 -- setup/hugepages.sh@100 -- # resv=0
00:03:44.915 21:26:23 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:03:44.915 nr_hugepages=1025
00:03:44.915 21:26:23 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:44.915 resv_hugepages=0
00:03:44.915 21:26:23 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:44.915 surplus_hugepages=0
00:03:44.915 21:26:23 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:44.915 anon_hugepages=0
00:03:44.915 21:26:23 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:44.915 21:26:23 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:44.915 21:26:23 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:44.915 21:26:23 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:44.915 21:26:23 -- setup/common.sh@18 -- # local node= 00:03:44.915 21:26:23 -- setup/common.sh@19 -- # local var val 00:03:44.915 21:26:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:44.915 21:26:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.915 21:26:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:44.915 21:26:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:44.915 21:26:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.915 21:26:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.915 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.915 21:26:23 -- setup/common.sh@31 -- # read -r var val _
00:03:44.915 21:26:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43509160 kB' 'MemAvailable: 47048132 kB' 'Buffers: 11500 kB' 'Cached: 10714532 kB' 'SwapCached: 0 kB' 'Active: 7829640 kB' 'Inactive: 3440680 kB' 'Active(anon): 7348584 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 547728 kB' 'Mapped: 173764 kB' 'Shmem: 6804296 kB' 'KReclaimable: 228732 kB' 'Slab: 737212 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 508480 kB' 'KernelStack: 21840 kB' 'PageTables: 7852 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 8683416 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213476 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
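The @107 and @109 checks traced just above encode the verification contract: the kernel's HugePages_Total must equal the requested nr_hugepages plus surplus and reserved pages, and with surp=0 and resv=0 the plain equality against 1025 must hold as well. Restated as standalone arithmetic with this run's values (a sketch of the invariant, not the script itself):

  #!/usr/bin/env bash
  nr_hugepages=1025    # requested by get_test_nr_hugepages
  surp=0               # HugePages_Surp, from get_meminfo
  resv=0               # HugePages_Rsvd, from get_meminfo
  total=1025           # HugePages_Total, from the snapshot above
  (( total == nr_hugepages + surp + resv )) &&
      echo 'total accounts for requested + surplus + reserved'
  (( total == nr_hugepages )) &&
      echo 'no surplus or reserved pages: all 1025 belong to the test'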
00:03:44.915 21:26:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.915 21:26:23 -- setup/common.sh@32 -- # continue
[ xtrace elided: the final per-key scan is under way; MemFree through PageTables checked against HugePages_Total so far with no match ]
00:03:44.915 21:26:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.915 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.915 21:26:23 --
setup/common.sh@31 -- # IFS=': ' 00:03:44.915 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.915 21:26:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.915 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.915 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.915 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.915 21:26:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.915 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.915 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.915 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- 
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.916 21:26:23 -- setup/common.sh@33 -- # echo 1025 00:03:44.916 21:26:23 -- setup/common.sh@33 -- # return 0 00:03:44.916 21:26:23 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:44.916 21:26:23 -- setup/hugepages.sh@112 -- # get_nodes 00:03:44.916 21:26:23 -- setup/hugepages.sh@27 -- # local node 00:03:44.916 21:26:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:44.916 21:26:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:44.916 21:26:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:44.916 21:26:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:03:44.916 21:26:23 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:44.916 21:26:23 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:44.916 21:26:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:44.916 21:26:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:44.916 21:26:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:44.916 21:26:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:44.916 21:26:23 -- setup/common.sh@18 -- # local node=0 00:03:44.916 21:26:23 -- setup/common.sh@19 -- # local var val 00:03:44.916 21:26:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:44.916 21:26:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.916 21:26:23 -- setup/common.sh@23 -- # [[ -e 
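The get_meminfo trace above reduces to a small parser: read the whole meminfo file, strip any per-node prefix, then split each line on ': ' until the requested key appears and echo its value. A minimal, self-contained sketch of that logic, assuming only bash with extglob (the function name is illustrative, not the actual setup/common.sh helper):

shopt -s extglob   # needed for the +([0-9]) pattern below

# Sketch: print the value of one meminfo key, optionally scoped to a NUMA node.
get_meminfo_sketch() {
    local get=$1 node=$2 line var val _
    local mem_f=/proc/meminfo
    # node-scoped stats live under sysfs instead of /proc
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # per-node lines carry a "Node <n> " prefix
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"   # "MemTotal: 60295232 kB" -> MemTotal / 60295232
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

get_meminfo_sketch HugePages_Total    # on the box in this log: 1025
get_meminfo_sketch HugePages_Surp 0   # node0-scoped lookup, as in the trace below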
00:03:44.916 21:26:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:44.916 21:26:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:44.916 21:26:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:44.916 21:26:23 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:44.916 21:26:23 -- setup/common.sh@18 -- # local node=0
00:03:44.916 21:26:23 -- setup/common.sh@19 -- # local var val
00:03:44.916 21:26:23 -- setup/common.sh@20 -- # local mem_f mem
00:03:44.916 21:26:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:44.916 21:26:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:44.916 21:26:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:44.916 21:26:23 -- setup/common.sh@28 -- # mapfile -t mem
00:03:44.916 21:26:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': '
00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _
00:03:44.916 21:26:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 27447216 kB' 'MemUsed: 5144868 kB' 'SwapCached: 0 kB' 'Active: 2421032 kB' 'Inactive: 153644 kB' 'Active(anon): 2206792 kB' 'Inactive(anon): 0 kB' 'Active(file): 214240 kB' 'Inactive(file): 153644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2337076 kB' 'Mapped: 95316 kB' 'AnonPages: 240792 kB' 'Shmem: 1969192 kB' 'KernelStack: 12776 kB' 'PageTables: 4128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 114256 kB' 'Slab: 346476 kB' 'SReclaimable: 114256 kB' 'SUnreclaim: 232220 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:44.916 21:26:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.916 21:26:23 -- setup/common.sh@32 -- # continue
00:03:44.916 21:26:23 -- setup/common.sh@31 -- # IFS=': '
00:03:44.916 21:26:23 -- setup/common.sh@31 -- # read -r var val _
[... the same @32 compare / @32 continue / @31 IFS / @31 read xtrace repeats for every remaining node0 key, MemFree through HugePages_Free, until the requested key matches ...]
00:03:44.917 21:26:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.917 21:26:23 -- setup/common.sh@33 -- # echo 0
00:03:44.917 21:26:23 -- setup/common.sh@33 -- # return 0
00:03:44.917 21:26:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
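The node-scoped lookups here read /sys/devices/system/node/nodeN/meminfo, whose lines differ from /proc/meminfo only by a leading node tag; that is what the mem=("${mem[@]#Node +([0-9]) }") step strips before parsing. Shape of the raw data and the strip, shown interactively (spacing approximate, values echoing the node0 dump above):

$ head -3 /sys/devices/system/node/node0/meminfo
Node 0 MemTotal:       32592084 kB
Node 0 MemFree:        27447216 kB
Node 0 MemUsed:         5144868 kB
$ line='Node 0 HugePages_Surp: 0'
$ shopt -s extglob; echo "${line#Node +([0-9]) }"
HugePages_Surp: 0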
00:03:44.917 21:26:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:44.917 21:26:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:44.917 21:26:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:44.917 21:26:23 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:44.917 21:26:23 -- setup/common.sh@18 -- # local node=1
00:03:44.917 21:26:23 -- setup/common.sh@19 -- # local var val
00:03:44.917 21:26:23 -- setup/common.sh@20 -- # local mem_f mem
00:03:44.917 21:26:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:44.917 21:26:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:44.917 21:26:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:44.917 21:26:23 -- setup/common.sh@28 -- # mapfile -t mem
00:03:44.917 21:26:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:44.917 21:26:23 -- setup/common.sh@31 -- # IFS=': '
00:03:44.917 21:26:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16061440 kB' 'MemUsed: 11641708 kB' 'SwapCached: 0 kB' 'Active: 5408292 kB' 'Inactive: 3287036 kB' 'Active(anon): 5141476 kB' 'Inactive(anon): 0 kB' 'Active(file): 266816 kB' 'Inactive(file): 3287036 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8388992 kB' 'Mapped: 78448 kB' 'AnonPages: 306548 kB' 'Shmem: 4835140 kB' 'KernelStack: 9048 kB' 'PageTables: 3668 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 114476 kB' 'Slab: 390736 kB' 'SReclaimable: 114476 kB' 'SUnreclaim: 276260 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:03:44.917 21:26:23 -- setup/common.sh@31 -- # read -r var val _
00:03:44.917 21:26:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.917 21:26:23 -- setup/common.sh@32 -- # continue
00:03:44.917 21:26:23 -- setup/common.sh@31 -- # IFS=': '
00:03:44.917 21:26:23 -- setup/common.sh@31 -- # read -r var val _
[... the same @32 compare / @32 continue / @31 IFS / @31 read xtrace repeats for every remaining node1 key, MemFree through HugePages_Free, until the requested key matches ...]
00:03:44.918 21:26:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.918 21:26:23 -- setup/common.sh@33 -- # echo 0
00:03:44.918 21:26:23 -- setup/common.sh@33 -- # return 0
00:03:44.918 21:26:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:44.918 21:26:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:44.918 21:26:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:44.918 21:26:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:44.918 21:26:23 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:03:44.918 node0=512 expecting 513
00:03:44.918 21:26:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:44.918 21:26:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:44.918 21:26:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:44.918 21:26:23 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:03:44.918 node1=513 expecting 512
00:03:44.918 21:26:23 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
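The closing [[ 512 513 == \5\1\2\ \5\1\3 ]] explains why odd_alloc passes even though each node reports the other node's count ("node0=512 expecting 513"): hugepages.sh@127 indexes two scratch arrays by the counts themselves, and bash lists indexed-array subscripts in ascending order, so the final check compares sorted multisets of counts rather than per-node placement. A standalone sketch of the trick, with values as in the trace above:

nodes_test=([0]=513 [1]=512)   # what the test asked for, per node
nodes_sys=([0]=512 [1]=513)    # what the kernel actually placed, per node
sorted_t=() sorted_s=()
for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1   # the subscript IS the count
    sorted_s[nodes_sys[node]]=1
done
# "${!arr[*]}" yields indexed-array subscripts in ascending order, so both
# sides expand to "512 513" and the per-node swap is deliberately ignored.
[[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo "multisets match"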
00:03:44.918
00:03:44.918 real 0m3.657s
00:03:44.918 user 0m1.398s
00:03:44.918 sys 0m2.331s
00:03:44.918 21:26:23 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:44.918 21:26:23 -- common/autotest_common.sh@10 -- # set +x
00:03:44.918 ************************************
00:03:44.918 END TEST odd_alloc
00:03:44.918 ************************************
00:03:44.918 21:26:23 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:03:44.918 21:26:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:44.918 21:26:23 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:44.918 21:26:23 -- common/autotest_common.sh@10 -- # set +x
00:03:44.918 ************************************
00:03:44.918 START TEST custom_alloc
00:03:44.918 ************************************
00:03:44.918 21:26:23 -- common/autotest_common.sh@1104 -- # custom_alloc
00:03:44.918 21:26:23 -- setup/hugepages.sh@167 -- # local IFS=,
00:03:44.918 21:26:23 -- setup/hugepages.sh@169 -- # local node
00:03:44.918 21:26:23 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:03:44.918 21:26:23 -- setup/hugepages.sh@170 -- # local nodes_hp
00:03:44.918 21:26:23 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:03:44.918 21:26:23 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:03:44.918 21:26:23 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:44.918 21:26:23 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:44.918 21:26:23 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:44.918 21:26:23 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:44.918 21:26:23 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:44.918 21:26:23 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:44.918 21:26:23 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:44.918 21:26:23 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:44.918 21:26:23 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:44.918 21:26:23 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:44.918 21:26:23 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:44.918 21:26:23 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:44.918 21:26:23 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:44.918 21:26:23 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:44.918 21:26:23 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:44.918 21:26:23 -- setup/hugepages.sh@83 -- # : 256
00:03:44.918 21:26:23 -- setup/hugepages.sh@84 -- # : 1
00:03:44.918 21:26:23 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:44.918 21:26:23 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:44.918 21:26:23 -- setup/hugepages.sh@83 -- # : 0
00:03:44.918 21:26:23 -- setup/hugepages.sh@84 -- # : 0
00:03:44.918 21:26:23 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:44.918 21:26:23 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:03:44.918 21:26:23 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:03:44.918 21:26:23 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:03:44.918 21:26:23 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:44.918 21:26:23 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:44.918 21:26:23 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:44.918 21:26:23 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:44.918 21:26:23 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:44.918 21:26:23 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:44.918 21:26:23 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:44.919 21:26:23 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:44.919 21:26:23 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:44.919 21:26:23 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:44.919 21:26:23 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:44.919 21:26:23 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:44.919 21:26:23 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:03:44.919 21:26:23 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:44.919 21:26:23 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:44.919 21:26:23 -- setup/hugepages.sh@78 -- # return 0
00:03:44.919 21:26:23 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:03:44.919 21:26:23 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:44.919 21:26:23 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:44.919 21:26:23 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:44.919 21:26:23 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:44.919 21:26:23 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:44.919 21:26:23 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:44.919 21:26:23 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:03:44.919 21:26:23 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:44.919 21:26:23 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:44.919 21:26:23 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:44.919 21:26:23 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:44.919 21:26:23 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:44.919 21:26:23 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:44.919 21:26:23 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:44.919 21:26:23 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:03:44.919 21:26:23 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:44.919 21:26:23 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:44.919 21:26:23 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:44.919 21:26:23 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:03:44.919 21:26:23 -- setup/hugepages.sh@78 -- # return 0
00:03:44.919 21:26:23 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:03:44.919 21:26:23 -- setup/hugepages.sh@187 -- # setup output
00:03:44.919 21:26:23 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:44.919 21:26:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
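The resulting HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' is handed to scripts/setup.sh, whose internals are not shown in this log. The standard kernel knob for that kind of per-node placement is the hugetlb sysfs tree; a hedged sketch of what such a spec maps onto (the parsing below is mine, and the 2048kB directory matches the Hugepagesize: 2048 kB reported in the dumps above):

HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
IFS=, read -ra spec <<< "$HUGENODE"
for entry in "${spec[@]}"; do
    node=${entry#nodes_hp[}; node=${node%%]*}   # -> 0, then 1
    pages=${entry#*=}                           # -> 512, then 1024
    # per-node hugetlb pool size; writing here asks the kernel to (de)allocate
    echo "$pages" | sudo tee \
        "/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages"
done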
00:03:48.207 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:48.207 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:48.470 21:26:26 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:03:48.470 21:26:26 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:03:48.470 21:26:26 -- setup/hugepages.sh@89 -- # local node
00:03:48.470 21:26:26 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:48.470 21:26:26 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:48.470 21:26:26 -- setup/hugepages.sh@92 -- # local surp
00:03:48.470 21:26:26 -- setup/hugepages.sh@93 -- # local resv
00:03:48.470 21:26:26 -- setup/hugepages.sh@94 -- # local anon
00:03:48.470 21:26:26 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:48.470 21:26:26 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:48.470 21:26:26 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:48.470 21:26:26 -- setup/common.sh@18 -- # local node=
00:03:48.470 21:26:26 -- setup/common.sh@19 -- # local var val
00:03:48.470 21:26:26 -- setup/common.sh@20 -- # local mem_f mem
00:03:48.470 21:26:26 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:48.470 21:26:26 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:48.470 21:26:26 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:48.470 21:26:26 -- setup/common.sh@28 -- # mapfile -t mem
00:03:48.470 21:26:26 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:48.470 21:26:26 -- setup/common.sh@31 -- # IFS=': '
00:03:48.470 21:26:26 -- setup/common.sh@31 -- # read -r var val _
00:03:48.470 21:26:26 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42474076 kB' 'MemAvailable: 46013048 kB' 'Buffers: 11500 kB' 'Cached: 10714636 kB' 'SwapCached: 0 kB' 'Active: 7831604 kB' 'Inactive: 3440680 kB' 'Active(anon): 7350548 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 548904 kB' 'Mapped: 173888 kB' 'Shmem: 6804400 kB' 'KReclaimable: 228732 kB' 'Slab: 736668 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 507936 kB' 'KernelStack: 21856 kB' 'PageTables: 7928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 8684036 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213380 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:48.470 21:26:27 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:48.470 21:26:27 -- setup/common.sh@32 -- # continue
00:03:48.470 21:26:27 -- setup/common.sh@31 -- # IFS=': '
00:03:48.470 21:26:27 -- setup/common.sh@31 -- # read -r var val _
[... the same @32 compare / @32 continue / @31 IFS / @31 read xtrace repeats for every remaining key, MemFree through HardwareCorrupted, until the requested key matches ...]
00:03:48.471 21:26:27 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:48.471 21:26:27 -- setup/common.sh@33 -- # echo 0
00:03:48.471 21:26:27 -- setup/common.sh@33 -- # return 0
00:03:48.471 21:26:27 -- setup/hugepages.sh@97 -- # anon=0
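The [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] line at the top of verify_nr_hugepages matches the bracketed mode in the kernel's transparent-hugepage policy string: anon hugepages are only expected to be nonzero when THP is not globally off. The same gate written out as a sketch (the helper name is mine; the sysfs file is the standard kernel interface):

thp_enabled() {
    # file contents look like: "always [madvise] never"
    local policy
    policy=$(cat /sys/kernel/mm/transparent_hugepage/enabled)
    [[ $policy != *"[never]"* ]]   # true unless THP is disabled outright
}
thp_enabled && echo "THP active: AnonHugePages may be nonzero"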
00:03:48.471 21:26:27 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:48.471 21:26:27 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:48.471 21:26:27 -- setup/common.sh@18 -- # local node=
00:03:48.471 21:26:27 -- setup/common.sh@19 -- # local var val
00:03:48.471 21:26:27 -- setup/common.sh@20 -- # local mem_f mem
00:03:48.471 21:26:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:48.471 21:26:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:48.471 21:26:27 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:48.471 21:26:27 -- setup/common.sh@28 -- # mapfile -t mem
00:03:48.471 21:26:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:48.471 21:26:27 -- setup/common.sh@31 -- # IFS=': '
00:03:48.471 21:26:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42475060 kB' 'MemAvailable: 46014032 kB' 'Buffers: 11500 kB' 'Cached: 10714636 kB' 'SwapCached: 0 kB' 'Active: 7830512 kB' 'Inactive: 3440680 kB' 'Active(anon): 7349456 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 548280 kB' 'Mapped: 173768 kB' 'Shmem: 6804400 kB' 'KReclaimable: 228732 kB' 'Slab: 736656 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 507924 kB' 'KernelStack: 21840 kB' 'PageTables: 7860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 8684048 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213364 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:48.471 21:26:27 -- setup/common.sh@31 -- # read -r var val _
00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue
00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': '
00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _
[... the same @32 compare / @32 continue / @31 IFS / @31 read xtrace repeats key by key, MemFree through KernelStack ...]
00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue
00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': '
00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _
00:03:48.472
21:26:27 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.472 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.472 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.473 
21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.473 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.473 21:26:27 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.473 21:26:27 -- setup/common.sh@33 -- # echo 0 00:03:48.473 21:26:27 -- setup/common.sh@33 -- # return 0 00:03:48.473 21:26:27 -- setup/hugepages.sh@99 -- # surp=0 00:03:48.473 21:26:27 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:48.473 21:26:27 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:48.473 21:26:27 -- setup/common.sh@18 -- # local node= 00:03:48.473 21:26:27 -- setup/common.sh@19 -- # local var val 00:03:48.473 21:26:27 -- setup/common.sh@20 -- # local mem_f mem 00:03:48.473 21:26:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.473 21:26:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.473 21:26:27 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.473 21:26:27 -- setup/common.sh@28 -- # 
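The get_meminfo sequence above, and each repetition of it below, follows one pattern: read a meminfo file into an array, strip the "Node <n> " prefix that per-node files carry, then walk the lines with IFS=': ' until the requested key matches and echo its value. A minimal standalone sketch of that pattern — a reconstruction inferred from the trace, not the verbatim setup/common.sh source:

#!/usr/bin/env bash
shopt -s extglob                          # for the +([0-9]) pattern below

# Sketch of the lookup pattern in the trace (reconstruction, not SPDK's
# setup/common.sh verbatim): pick one key out of /proc/meminfo, or out of
# a per-node meminfo file when a node number is given.
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo mem line var val _
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")      # per-node lines carry "Node <n> "
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

get_meminfo HugePages_Surp                # global surplus -> 0 in this run
get_meminfo HugePages_Total 0             # node 0 total   -> 512 in this run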
00:03:48.473 21:26:27 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:48.473 21:26:27 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:48.473 21:26:27 -- setup/common.sh@18 -- # local node=
00:03:48.473 21:26:27 -- setup/common.sh@19 -- # local var val
00:03:48.473 21:26:27 -- setup/common.sh@20 -- # local mem_f mem
00:03:48.473 21:26:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:48.473 21:26:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:48.473 21:26:27 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:48.473 21:26:27 -- setup/common.sh@28 -- # mapfile -t mem
00:03:48.473 21:26:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:48.473 21:26:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42475784 kB' 'MemAvailable: 46014756 kB' 'Buffers: 11500 kB' 'Cached: 10714636 kB' 'SwapCached: 0 kB' 'Active: 7830548 kB' 'Inactive: 3440680 kB' 'Active(anon): 7349492 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 548312 kB' 'Mapped: 173768 kB' 'Shmem: 6804400 kB' 'KReclaimable: 228732 kB' 'Slab: 736656 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 507924 kB' 'KernelStack: 21856 kB' 'PageTables: 7912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 8684060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213364 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... setup/common.sh@31/@32 scan loop: each key is tested against HugePages_Rsvd and skipped with continue ...]
00:03:48.474 21:26:27 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:48.474 21:26:27 -- setup/common.sh@33 -- # echo 0
00:03:48.474 21:26:27 -- setup/common.sh@33 -- # return 0
00:03:48.474 21:26:27 -- setup/hugepages.sh@100 -- # resv=0
00:03:48.474 21:26:27 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:03:48.474 nr_hugepages=1536
00:03:48.474 21:26:27 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:48.474 resv_hugepages=0
00:03:48.474 21:26:27 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:48.474 surplus_hugepages=0
00:03:48.474 21:26:27 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:48.474 anon_hugepages=0
00:03:48.474 21:26:27 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:03:48.474 21:26:27 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
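Those four echoed counters feed the check at hugepages.sh@107: the kernel's HugePages_Total has to equal the requested nr_hugepages plus any surplus and reserved pages. The same arithmetic as a standalone sketch (the awk extraction stands in for the get_meminfo call above; the counter names and the 1536/0/0 values are the ones this run observed):

#!/usr/bin/env bash
# Hugepage accounting check: HugePages_Total == nr_hugepages + surp + resv.
nr_hugepages=1536                          # pages the test requested
surp=0                                     # HugePages_Surp observed above
resv=0                                     # HugePages_Rsvd observed above
total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: $total pages"
else
    echo "mismatch: total=$total, expected $((nr_hugepages + surp + resv))" >&2
    exit 1
fi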
00:03:48.474 21:26:27 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:48.474 21:26:27 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:48.474 21:26:27 -- setup/common.sh@18 -- # local node=
00:03:48.474 21:26:27 -- setup/common.sh@19 -- # local var val
00:03:48.474 21:26:27 -- setup/common.sh@20 -- # local mem_f mem
00:03:48.474 21:26:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:48.474 21:26:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:48.474 21:26:27 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:48.474 21:26:27 -- setup/common.sh@28 -- # mapfile -t mem
00:03:48.474 21:26:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:48.474 21:26:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42476788 kB' 'MemAvailable: 46015760 kB' 'Buffers: 11500 kB' 'Cached: 10714640 kB' 'SwapCached: 0 kB' 'Active: 7830396 kB' 'Inactive: 3440680 kB' 'Active(anon): 7349340 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 548152 kB' 'Mapped: 173768 kB' 'Shmem: 6804404 kB' 'KReclaimable: 228732 kB' 'Slab: 736656 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 507924 kB' 'KernelStack: 21840 kB' 'PageTables: 7860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 8684076 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213364 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... setup/common.sh@31/@32 scan loop: each key is tested against HugePages_Total and skipped with continue ...]
00:03:48.476 21:26:27 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:48.476 21:26:27 -- setup/common.sh@33 -- # echo 1536
00:03:48.476 21:26:27 -- setup/common.sh@33 -- # return 0
00:03:48.476 21:26:27 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:03:48.476 21:26:27 -- setup/hugepages.sh@112 -- # get_nodes
00:03:48.476 21:26:27 -- setup/hugepages.sh@27 -- # local node
00:03:48.476 21:26:27 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:48.476 21:26:27 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:48.476 21:26:27 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:48.476 21:26:27 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:48.476 21:26:27 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:48.476 21:26:27 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
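get_nodes (hugepages.sh@27-33) discovers the NUMA layout by globbing /sys/devices/system/node/node<N>, the same sysfs tree the two per-node get_meminfo calls below read from. A sketch of that enumeration, assuming the standard "Node <n> "-prefixed per-node meminfo format; the 512/1024 split in the final comment is what this run recorded:

#!/usr/bin/env bash
shopt -s extglob nullglob
# Enumerate NUMA nodes and read each node's preallocated hugepage count.
# Per-node meminfo lines are prefixed "Node <n> ", so the key is field 3.
declare -a nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    id=${node##*node}                      # ".../node1" -> "1"
    nodes_sys[id]=$(awk '$3 == "HugePages_Total:" {print $4}' "$node/meminfo")
done
echo "no_nodes=${#nodes_sys[@]} per-node hugepages: ${nodes_sys[*]}"
# This run: nodes_sys[0]=512, nodes_sys[1]=1024 (512 + 1024 == 1536 total).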
00:03:48.476 21:26:27 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:48.476 21:26:27 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:48.476 21:26:27 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:48.476 21:26:27 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:48.476 21:26:27 -- setup/common.sh@18 -- # local node=0
00:03:48.476 21:26:27 -- setup/common.sh@19 -- # local var val
00:03:48.476 21:26:27 -- setup/common.sh@20 -- # local mem_f mem
00:03:48.476 21:26:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:48.476 21:26:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:48.476 21:26:27 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:48.476 21:26:27 -- setup/common.sh@28 -- # mapfile -t mem
00:03:48.476 21:26:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:48.476 21:26:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 27455324 kB' 'MemUsed: 5136760 kB' 'SwapCached: 0 kB' 'Active: 2421224 kB' 'Inactive: 153644 kB' 'Active(anon): 2206984 kB' 'Inactive(anon): 0 kB' 'Active(file): 214240 kB' 'Inactive(file): 153644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2337104 kB' 'Mapped: 95316 kB' 'AnonPages: 240908 kB' 'Shmem: 1969220 kB' 'KernelStack: 12760 kB' 'PageTables: 4084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 114256 kB' 'Slab: 346192 kB' 'SReclaimable: 114256 kB' 'SUnreclaim: 231936 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... setup/common.sh@31/@32 scan loop: each node0 key is tested against HugePages_Surp and skipped with continue ...]
00:03:48.477 21:26:27 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:48.477 21:26:27 -- setup/common.sh@33 -- # echo 0
00:03:48.477 21:26:27 -- setup/common.sh@33 -- # return 0
00:03:48.477 21:26:27 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:48.477 21:26:27 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:48.477 21:26:27 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:48.477 21:26:27 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:48.477 21:26:27 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:48.477 21:26:27 -- setup/common.sh@18 -- # local node=1
00:03:48.477 21:26:27 -- setup/common.sh@19 -- # local var val
00:03:48.477 21:26:27 -- setup/common.sh@20 -- # local mem_f mem
00:03:48.477 21:26:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:48.477 21:26:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:48.477 21:26:27 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:48.477 21:26:27 -- setup/common.sh@28 -- # mapfile -t mem
00:03:48.477 21:26:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:48.477 21:26:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 15021588 kB' 'MemUsed: 12681560 kB' 'SwapCached: 0 kB' 'Active: 5409292 kB' 'Inactive: 3287036 kB' 'Active(anon): 5142476 kB' 'Inactive(anon): 0 kB' 'Active(file): 266816 kB' 'Inactive(file): 3287036 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8389096 kB' 'Mapped: 78452 kB' 'AnonPages: 307316 kB' 'Shmem: 4835244 kB' 'KernelStack: 9064 kB' 'PageTables: 3724 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 114476 kB' 'Slab: 390464 kB' 'SReclaimable: 114476 kB' 'SUnreclaim: 275988 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... setup/common.sh@31/@32 scan loop: each node1 key is tested against HugePages_Surp and skipped with continue ...]
00:03:48.477 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.477 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.477 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:48.478 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # continue 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # IFS=': ' 00:03:48.478 21:26:27 -- setup/common.sh@31 -- # read -r var val _ 00:03:48.478 21:26:27 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.478 21:26:27 -- setup/common.sh@33 -- # echo 0 00:03:48.478 21:26:27 -- setup/common.sh@33 -- # return 0 00:03:48.478 21:26:27 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:48.478 21:26:27 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:48.478 21:26:27 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:48.478 21:26:27 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:48.478 21:26:27 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:48.478 node0=512 expecting 512 00:03:48.478 21:26:27 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:48.478 21:26:27 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:48.478 21:26:27 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:48.478 21:26:27 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:48.478 node1=1024 expecting 1024 00:03:48.478 21:26:27 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:48.478 00:03:48.478 real 0m3.501s 00:03:48.478 user 0m1.260s 00:03:48.478 sys 0m2.288s 00:03:48.478 21:26:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:48.478 21:26:27 -- common/autotest_common.sh@10 -- # set +x 00:03:48.478 ************************************ 00:03:48.478 END TEST custom_alloc 00:03:48.478 ************************************ 00:03:48.478 21:26:27 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:48.478 21:26:27 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:48.478 21:26:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:48.478 21:26:27 -- common/autotest_common.sh@10 -- # set +x 00:03:48.478 ************************************ 00:03:48.478 START TEST no_shrink_alloc 00:03:48.478 ************************************ 00:03:48.478 21:26:27 -- common/autotest_common.sh@1104 -- # no_shrink_alloc 00:03:48.478 21:26:27 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:48.478 21:26:27 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:48.478 21:26:27 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:48.478 21:26:27 -- setup/hugepages.sh@51 -- # shift 00:03:48.478 21:26:27 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:48.478 21:26:27 -- setup/hugepages.sh@52 -- # local node_ids 00:03:48.478 21:26:27 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:48.478 21:26:27 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:48.478 21:26:27 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:48.478 21:26:27 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:48.478 21:26:27 -- setup/hugepages.sh@62 -- # 
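The custom_alloc trace above is driven by setup/common.sh's get_meminfo: it picks /proc/meminfo or a per-node /sys/devices/system/node/nodeN/meminfo file, strips the "Node <n> " prefix that the per-node files add, and scans field by field until the requested key matches. A minimal standalone sketch of that pattern; the function name, error handling, and the usage line are mine, not SPDK's:

    #!/usr/bin/env bash
    shopt -s extglob   # the +([0-9]) pattern below needs extended globbing

    # Sketch of the traced get_meminfo pattern: print one meminfo field,
    # optionally from a specific NUMA node.
    get_meminfo_sketch() {
        local get=$1 node=${2:-} var val _ mem_f mem
        mem_f=/proc/meminfo
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node <n> "; strip it so the
        # field names match the /proc/meminfo spelling.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

    get_meminfo_sketch HugePages_Surp 1   # prints 0 on the node traced above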
00:03:48.478 21:26:27 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:03:48.478 21:26:27 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:48.478 21:26:27 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:48.478 21:26:27 -- common/autotest_common.sh@10 -- # set +x
00:03:48.478 ************************************
00:03:48.478 START TEST no_shrink_alloc
00:03:48.478 ************************************
00:03:48.478 21:26:27 -- common/autotest_common.sh@1104 -- # no_shrink_alloc
00:03:48.478 21:26:27 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:03:48.478 21:26:27 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:48.478 21:26:27 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:48.478 21:26:27 -- setup/hugepages.sh@51 -- # shift
00:03:48.478 21:26:27 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:48.478 21:26:27 -- setup/hugepages.sh@52 -- # local node_ids
00:03:48.478 21:26:27 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:48.478 21:26:27 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:48.478 21:26:27 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:48.478 21:26:27 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:48.478 21:26:27 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:48.478 21:26:27 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:48.478 21:26:27 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:48.478 21:26:27 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:48.478 21:26:27 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:48.478 21:26:27 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:03:48.478 21:26:27 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:48.478 21:26:27 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:48.478 21:26:27 -- setup/hugepages.sh@73 -- # return 0
00:03:48.478 21:26:27 -- setup/hugepages.sh@198 -- # setup output
00:03:48.478 21:26:27 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:48.478 21:26:27 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:51.766 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:51.766 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
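Before verify_nr_hugepages runs, the get_test_nr_hugepages 2097152 0 trace above has already fixed the pool size. The arithmetic is plain: the size argument is in kB, and with the 2048 kB Hugepagesize visible in the meminfo snapshots it yields 1024 pages, all assigned to node 0. A restatement under those assumptions (variable names are mine):

    # Restates the traced computation; assumes the size argument is kB and the
    # default hugepage size is the 2048 kB the snapshots report.
    size_kb=2097152
    hugepage_kb=2048
    nr_hugepages=$(( size_kb / hugepage_kb ))   # = 1024
    nodes_test=()
    nodes_test[0]=$nr_hugepages                 # node_ids=('0') in the trace
    echo "nr_hugepages=$nr_hugepages on node 0"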
00:03:52.028 21:26:30 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:03:52.028 21:26:30 -- setup/hugepages.sh@89 -- # local node
00:03:52.028 21:26:30 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:52.028 21:26:30 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:52.028 21:26:30 -- setup/hugepages.sh@92 -- # local surp
00:03:52.028 21:26:30 -- setup/hugepages.sh@93 -- # local resv
00:03:52.028 21:26:30 -- setup/hugepages.sh@94 -- # local anon
00:03:52.028 21:26:30 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:52.028 21:26:30 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:52.028 21:26:30 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:52.028 21:26:30 -- setup/common.sh@18 -- # local node=
00:03:52.028 21:26:30 -- setup/common.sh@19 -- # local var val
00:03:52.028 21:26:30 -- setup/common.sh@20 -- # local mem_f mem
00:03:52.028 21:26:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:52.028 21:26:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:52.028 21:26:30 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:52.028 21:26:30 -- setup/common.sh@28 -- # mapfile -t mem
00:03:52.028 21:26:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:52.028 21:26:30 -- setup/common.sh@31 -- # IFS=': '
00:03:52.028 21:26:30 -- setup/common.sh@31 -- # read -r var val _
00:03:52.028 21:26:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43527116 kB' 'MemAvailable: 47066088 kB' 'Buffers: 11500 kB' 'Cached: 10714764 kB' 'SwapCached: 0 kB' 'Active: 7832668 kB' 'Inactive: 3440680 kB' 'Active(anon): 7351612 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 549352 kB' 'Mapped: 173912 kB' 'Shmem: 6804528 kB' 'KReclaimable: 228732 kB' 'Slab: 736508 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 507776 kB' 'KernelStack: 22016 kB' 'PageTables: 8132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8689356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213524 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... setup/common.sh@32 checks each field (MemTotal through HardwareCorrupted) against AnonHugePages, continuing past every non-match ...]
00:03:52.029 21:26:30 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:52.029 21:26:30 -- setup/common.sh@33 -- # echo 0
00:03:52.029 21:26:30 -- setup/common.sh@33 -- # return 0
00:03:52.029 21:26:30 -- setup/hugepages.sh@97 -- # anon=0
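The anon probe above only ran because the hugepages.sh@96 test found transparent hugepages not pinned to [never]; the bracketed word in that sysfs file is the active policy. A hedged sketch of the same gate, reusing the hypothetical get_meminfo_sketch helper from the earlier sketch:

    # The active THP policy is the bracketed word, e.g. "always [madvise] never".
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo_sketch AnonHugePages)   # hypothetical helper, see above
        echo "anon_hugepages=${anon:-0}"
    fi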
8236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8689372 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213508 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.029 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.029 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 
21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- 
setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 
-- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ HugePages_Total == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.030 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.030 21:26:30 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:52.031 21:26:30 -- setup/common.sh@33 -- # echo 0 00:03:52.031 21:26:30 -- setup/common.sh@33 -- # return 0 00:03:52.031 21:26:30 -- setup/hugepages.sh@99 -- # surp=0 00:03:52.031 21:26:30 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:52.031 21:26:30 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:52.031 21:26:30 -- setup/common.sh@18 -- # local node= 00:03:52.031 21:26:30 -- setup/common.sh@19 -- # local var val 00:03:52.031 21:26:30 -- setup/common.sh@20 -- # local mem_f mem 00:03:52.031 21:26:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.031 21:26:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:52.031 21:26:30 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:52.031 21:26:30 -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.031 21:26:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43526156 kB' 'MemAvailable: 47065128 kB' 'Buffers: 11500 kB' 'Cached: 10714780 kB' 'SwapCached: 0 kB' 'Active: 7831464 kB' 'Inactive: 3440680 kB' 'Active(anon): 7350408 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 549140 kB' 'Mapped: 173776 kB' 'Shmem: 6804544 kB' 'KReclaimable: 228732 kB' 'Slab: 736464 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 507732 kB' 'KernelStack: 22000 kB' 'PageTables: 8320 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8687868 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213492 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 
-- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 
21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue 
00:03:52.031 21:26:30 -- setup/common.sh@31 -- # IFS=': '
00:03:52.031 21:26:30 -- setup/common.sh@31 -- # read -r var val _
00:03:52.031 21:26:30 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:52.031 21:26:30 -- setup/common.sh@32 -- # continue
00:03:52.031 [... same IFS/read/compare/continue trace repeated for each remaining field, PageTables through HugePages_Free, none matching ...]
00:03:52.032 21:26:30 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:52.032 21:26:30 -- setup/common.sh@33 -- # echo 0
00:03:52.032 21:26:30 -- setup/common.sh@33 -- # return 0
00:03:52.032 21:26:30 -- setup/hugepages.sh@100 -- # resv=0
00:03:52.032 21:26:30 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:52.032 nr_hugepages=1024
00:03:52.032 21:26:30 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:52.032 resv_hugepages=0
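The loop traced above is setup/common.sh's get_meminfo helper: snapshot the relevant meminfo file, strip any per-node prefix, then scan field by field until the requested key matches and echo its value. A minimal sketch reconstructed from this trace (simplified and illustrative, not the exact SPDK helper):

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the "Node N " prefix pattern below

    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo mem var val _ line
        # With a node argument, prefer the per-node counters when they exist.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; strip it.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            # Split "Key: value kB" into key, value, and unit.
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done
        return 1
    }

    get_meminfo HugePages_Rsvd     # system-wide lookup, as traced above
    get_meminfo HugePages_Surp 0   # same lookup scoped to NUMA node 0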
00:03:52.032 21:26:30 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:52.032 surplus_hugepages=0
00:03:52.032 21:26:30 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:52.032 anon_hugepages=0
00:03:52.032 21:26:30 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:52.032 21:26:30 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:52.032 21:26:30 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:52.032 21:26:30 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:52.032 21:26:30 -- setup/common.sh@18 -- # local node=
00:03:52.032 21:26:30 -- setup/common.sh@19 -- # local var val
00:03:52.032 21:26:30 -- setup/common.sh@20 -- # local mem_f mem
00:03:52.032 21:26:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:52.032 21:26:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:52.032 21:26:30 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:52.032 21:26:30 -- setup/common.sh@28 -- # mapfile -t mem
00:03:52.032 21:26:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:52.032 21:26:30 -- setup/common.sh@31 -- # IFS=': '
00:03:52.032 21:26:30 -- setup/common.sh@31 -- # read -r var val _
00:03:52.032 21:26:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43525988 kB' 'MemAvailable: 47064960 kB' 'Buffers: 11500 kB' 'Cached: 10714780 kB' 'SwapCached: 0 kB' 'Active: 7831300 kB' 'Inactive: 3440680 kB' 'Active(anon): 7350244 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 548980 kB' 'Mapped: 173776 kB' 'Shmem: 6804544 kB' 'KReclaimable: 228732 kB' 'Slab: 736464 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 507732 kB' 'KernelStack: 21920 kB' 'PageTables: 8252 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8689400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213524 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:52.032 21:26:30 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:52.032 21:26:30 -- setup/common.sh@32 -- # continue
00:03:52.033 [... same IFS/read/compare/continue trace repeated for each remaining field, MemFree through Unaccepted, none matching ...]
00:03:52.033 21:26:30 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:52.033 21:26:30 -- setup/common.sh@33 -- # echo 1024
00:03:52.033 21:26:30 -- setup/common.sh@33 -- # return 0
00:03:52.033 21:26:30 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:52.033 21:26:30 -- setup/hugepages.sh@112 -- # get_nodes
00:03:52.033 21:26:30 -- setup/hugepages.sh@27 -- # local node
00:03:52.033 21:26:30 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:52.033 21:26:30 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:52.033 21:26:30 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:52.033 21:26:30 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:52.033 21:26:30 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:52.033 21:26:30 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
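get_nodes, just traced, discovers the NUMA layout by globbing the sysfs node directories; on this rig it finds two nodes, with all 1024 pages sitting on node0 and none on node1. A sketch of that enumeration (illustrative; it reuses the get_meminfo sketch above, and the assumption that the recorded value is each node's HugePages_Total):

    shopt -s extglob nullglob
    declare -A nodes_sys

    for node in /sys/devices/system/node/node+([0-9]); do
        # ${node##*node} keeps only the trailing index, e.g. node0 -> 0.
        nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
    done

    no_nodes=${#nodes_sys[@]}
    (( no_nodes > 0 ))   # sanity check, as at hugepages.sh@33
    echo "nodes: ${!nodes_sys[*]} -> hugepages: ${nodes_sys[*]}"

The trace resumes below with the node0 HugePages_Surp lookup.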
00:03:52.033 21:26:30 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:52.033 21:26:30 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:52.033 21:26:30 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:52.033 21:26:30 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:52.034 21:26:30 -- setup/common.sh@18 -- # local node=0
00:03:52.034 21:26:30 -- setup/common.sh@19 -- # local var val
00:03:52.034 21:26:30 -- setup/common.sh@20 -- # local mem_f mem
00:03:52.034 21:26:30 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:52.034 21:26:30 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:52.034 21:26:30 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:52.034 21:26:30 -- setup/common.sh@28 -- # mapfile -t mem
00:03:52.034 21:26:30 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:52.034 21:26:30 -- setup/common.sh@31 -- # IFS=': '
00:03:52.034 21:26:30 -- setup/common.sh@31 -- # read -r var val _
00:03:52.034 21:26:30 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26425044 kB' 'MemUsed: 6167040 kB' 'SwapCached: 0 kB' 'Active: 2422828 kB' 'Inactive: 153644 kB' 'Active(anon): 2208588 kB' 'Inactive(anon): 0 kB' 'Active(file): 214240 kB' 'Inactive(file): 153644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2337156 kB' 'Mapped: 95316 kB' 'AnonPages: 242508 kB' 'Shmem: 1969272 kB' 'KernelStack: 12952 kB' 'PageTables: 4744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 114256 kB' 'Slab: 345996 kB' 'SReclaimable: 114256 kB' 'SUnreclaim: 231740 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:52.034 21:26:30 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:52.034 21:26:30 -- setup/common.sh@32 -- # continue
00:03:52.034 [... same IFS/read/compare/continue trace repeated for each remaining node0 field, MemFree through HugePages_Free, none matching ...]
00:03:52.034 21:26:30 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:52.034 21:26:30 -- setup/common.sh@33 -- # echo 0
00:03:52.034 21:26:30 -- setup/common.sh@33 -- # return 0
00:03:52.034 21:26:30 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:52.034 21:26:30 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:52.034 21:26:30 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:52.034 21:26:30 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:52.034 21:26:30 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:52.034 node0=1024 expecting 1024
00:03:52.035 21:26:30 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:52.035 21:26:30 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:03:52.035 21:26:30 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:03:52.035 21:26:30 -- setup/hugepages.sh@202 -- # setup output
00:03:52.035 21:26:30 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:52.035 21:26:30 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:55.325 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:55.325 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:55.325 INFO: Requested 512 hugepages but 1024 already allocated on node0
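Two things just happened in the trace: the per-node check printed "node0=1024 expecting 1024", and setup.sh was re-run in output mode with NRHUGE=512, which was a no-op because 1024 pages were already allocated. A sketch of both steps (the sysfs path is the standard kernel location for 2 MiB pages; the skip logic is inferred from the INFO line, not read from setup.sh itself):

    # Per-node verification, mirroring hugepages.sh@126-130. nodes_test[] is
    # assumed to have been filled in by the steps above. Using the observed
    # counts as associative-array keys collapses identical counts to one key.
    declare -A sorted_t=()
    for node in "${!nodes_test[@]}"; do
        sorted_t[${nodes_test[node]}]=1
        echo "node$node=${nodes_test[node]} expecting ${nodes_test[node]}"
    done

    # Hugepage allocation is a sysfs write; skip it when enough pages exist.
    nr=/sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
    want=${NRHUGE:-512} have=$(<"$nr")
    if (( have >= want )); then
        echo "INFO: Requested $want hugepages but $have already allocated on node0"
    else
        echo "$want" > "$nr"
    fi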
00:03:55.325 21:26:33 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:03:55.325 21:26:33 -- setup/hugepages.sh@89 -- # local node
00:03:55.325 21:26:33 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:55.325 21:26:33 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:55.325 21:26:33 -- setup/hugepages.sh@92 -- # local surp
00:03:55.325 21:26:33 -- setup/hugepages.sh@93 -- # local resv
00:03:55.325 21:26:33 -- setup/hugepages.sh@94 -- # local anon
00:03:55.325 21:26:33 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:55.325 21:26:33 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:55.325 21:26:33 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:55.325 21:26:33 -- setup/common.sh@18 -- # local node=
00:03:55.325 21:26:33 -- setup/common.sh@19 -- # local var val
00:03:55.325 21:26:33 -- setup/common.sh@20 -- # local mem_f mem
00:03:55.325 21:26:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:55.325 21:26:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:55.325 21:26:33 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:55.325 21:26:33 -- setup/common.sh@28 -- # mapfile -t mem
00:03:55.325 21:26:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:55.325 21:26:33 -- setup/common.sh@31 -- # IFS=': '
00:03:55.325 21:26:33 -- setup/common.sh@31 -- # read -r var val _
00:03:55.326 21:26:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43536768 kB' 'MemAvailable: 47075740 kB' 'Buffers: 11500 kB' 'Cached: 10714880 kB' 'SwapCached: 0 kB' 'Active: 7831816 kB' 'Inactive: 3440680 kB' 'Active(anon): 7350760 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 549224 kB' 'Mapped: 173816 kB' 'Shmem: 6804644 kB' 'KReclaimable: 228732 kB' 'Slab: 736460 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 507728 kB' 'KernelStack: 21904 kB' 'PageTables: 7184 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8688484 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213620 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:03:55.326 21:26:33 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:55.326 21:26:33 -- setup/common.sh@32 -- # continue
00:03:55.326 [... same IFS/read/compare/continue trace repeated for each remaining field, MemFree through HardwareCorrupted, none matching ...]
00:03:55.326 21:26:34 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:55.326 21:26:34 -- setup/common.sh@33 -- # echo 0
00:03:55.326 21:26:34 -- setup/common.sh@33 -- # return 0
00:03:55.327 21:26:34 -- setup/hugepages.sh@97 -- # anon=0
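The hugepages.sh@96 test above reads the kernel's transparent-hugepage mode string, where the bracketed entry is the active one; "always [madvise] never" does not contain "[never]", so the suite samples AnonHugePages as the anonymous-hugepage baseline before measuring surplus and reserved pages. A sketch of that probe (standard sysfs path; illustrative, reusing the get_meminfo sketch above):

    # THP mode looks like "always [madvise] never"; only sample AnonHugePages
    # when transparent hugepages are not globally disabled.
    anon=0
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # 0 kB on this run
    fi
    echo "anon_hugepages=$anon"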
00:03:55.327 21:26:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:55.327 21:26:34 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:55.327 21:26:34 -- setup/common.sh@28 -- # mapfile -t mem 00:03:55.327 21:26:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.327 21:26:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43539092 kB' 'MemAvailable: 47078064 kB' 'Buffers: 11500 kB' 'Cached: 10714880 kB' 'SwapCached: 0 kB' 'Active: 7832872 kB' 'Inactive: 3440680 kB' 'Active(anon): 7351816 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 550352 kB' 'Mapped: 173936 kB' 'Shmem: 6804644 kB' 'KReclaimable: 228732 kB' 'Slab: 737016 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 508284 kB' 'KernelStack: 22016 kB' 'PageTables: 8332 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8690012 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213700 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # continue 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # continue 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # continue 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # continue 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # continue 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # continue 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.327 21:26:34 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.327 21:26:34 -- 
setup/common.sh@32 -- # continue 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.327 21:26:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.327 [... the same three-entry trace (setup/common.sh@32 [[ <key> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / @32 continue / @31 IFS=': ' + read -r var val _) repeats for every remaining /proc/meminfo key: Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd -- none of which matches (00:03:55.327-00:03:55.328, 21:26:34) ...] 00:03:55.328 21:26:34 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.328 21:26:34 -- setup/common.sh@33 -- # echo 0 00:03:55.328 21:26:34 -- setup/common.sh@33 -- # return 0 00:03:55.328 21:26:34 -- setup/hugepages.sh@99 -- # surp=0
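The get_meminfo helper being traced here walks a meminfo file key by key and echoes the value of the single key requested. Below is a minimal bash sketch of that technique, assuming per-node statistics live in /sys/devices/system/node/node<N>/meminfo with a "Node <N> " prefix on every line; get_meminfo_sketch is a hypothetical name, not the literal setup/common.sh source:

    #!/usr/bin/env bash
    # Sketch: print one meminfo value, optionally for a single NUMA node.
    get_meminfo_sketch() {
        local key=$1 node=${2:-} mem_f=/proc/meminfo var val _
        [[ -n $node ]] && mem_f=/sys/devices/system/node/node$node/meminfo
        while IFS=': ' read -r var val _; do
            if [[ $var == "$key" ]]; then
                echo "$val"    # numeric value; the trailing "kB" unit lands in $_
                return 0
            fi
        done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")   # strip the per-node prefix
        return 1               # key not present
    }
    # e.g. get_meminfo_sketch HugePages_Surp   -> 0 on this host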
00:03:55.328 21:26:34 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:55.328 21:26:34 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:55.328 21:26:34 -- setup/common.sh@18 -- # local node= 00:03:55.328 21:26:34 -- setup/common.sh@19 -- # local var val 00:03:55.328 21:26:34 -- setup/common.sh@20 -- # local mem_f mem 00:03:55.328 21:26:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:55.328 21:26:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:55.328 21:26:34 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:55.328 21:26:34 -- setup/common.sh@28 -- # mapfile -t mem 00:03:55.328 21:26:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:55.328 21:26:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.328 21:26:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:55.328 21:26:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43540148 kB' 'MemAvailable: 47079120 kB' 'Buffers: 11500 kB' 'Cached: 10714892 kB' 'SwapCached: 0 kB' 'Active: 7832968 kB' 'Inactive: 3440680 kB' 'Active(anon): 7351912 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 550412 kB' 'Mapped: 173820 kB' 'Shmem: 6804656 kB' 'KReclaimable: 228732 kB' 'Slab: 736952 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 508220 kB' 'KernelStack: 22288 kB' 'PageTables: 8968 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8690028 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213684 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' [... the key scan repeats as before, now against \H\u\g\e\P\a\g\e\s\_\R\s\v\d, every key from MemTotal through HugePages_Free continuing (00:03:55.328-00:03:55.329, 21:26:34) ...] 00:03:55.329 21:26:34 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:55.329 21:26:34 -- setup/common.sh@33 -- # echo 0 00:03:55.329 21:26:34 -- setup/common.sh@33 -- # return 0 00:03:55.329 21:26:34 -- setup/hugepages.sh@100 -- # resv=0 00:03:55.329 21:26:34 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:55.329 nr_hugepages=1024 00:03:55.329 21:26:34 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:55.329 resv_hugepages=0 00:03:55.329 21:26:34 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:55.329 surplus_hugepages=0 00:03:55.329 21:26:34 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:55.329 anon_hugepages=0 00:03:55.329 21:26:34 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:55.329 21:26:34 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
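The numbers just read back are internally consistent: HugePages_Total 1024 at Hugepagesize 2048 kB is 1024 x 2048 kB = 2,097,152 kB, exactly the 'Hugetlb: 2097152 kB' line, and with surp=0 and resv=0 the assertion above reduces to 1024 == 1024. A hedged sketch of the same consistency check, reusing the hypothetical get_meminfo_sketch from the earlier sketch:

    # Sketch: the pool-accounting identity the test keeps re-asserting.
    # Assumes get_meminfo_sketch (defined above) is in scope.
    nr_hugepages=1024                                # value configured for this run
    total=$(get_meminfo_sketch HugePages_Total)      # 1024
    surp=$(get_meminfo_sketch HugePages_Surp)        # 0
    resv=$(get_meminfo_sketch HugePages_Rsvd)        # 0
    size_kb=$(get_meminfo_sketch Hugepagesize)       # 2048
    (( total == nr_hugepages + surp + resv )) || echo "hugepage pool drifted" >&2
    echo "hugetlb pool: $(( total * size_kb )) kB"   # 2097152 kB == 2 GiB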
00:03:55.329 21:26:34 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:55.329 21:26:34 -- setup/common.sh@17 -- # local get=HugePages_Total [... same setup/common.sh@18-@31 preamble over /proc/meminfo as above ...] 00:03:55.329 21:26:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 43544108 kB' 'MemAvailable: 47083080 kB' 'Buffers: 11500 kB' 'Cached: 10714912 kB' 'SwapCached: 0 kB' 'Active: 7832596 kB' 'Inactive: 3440680 kB' 'Active(anon): 7351540 kB' 'Inactive(anon): 0 kB' 'Active(file): 481056 kB' 'Inactive(file): 3440680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 550144 kB' 'Mapped: 173808 kB' 'Shmem: 6804676 kB' 'KReclaimable: 228732 kB' 'Slab: 737216 kB' 'SReclaimable: 228732 kB' 'SUnreclaim: 508484 kB' 'KernelStack: 22096 kB' 'PageTables: 8664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 8690040 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213700 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' [... key scan against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l, every key from MemTotal through Unaccepted continuing (00:03:55.329-00:03:55.591, 21:26:34) ...] 00:03:55.591 21:26:34 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:55.591 21:26:34 -- setup/common.sh@33 -- # echo 1024 00:03:55.591 21:26:34 -- setup/common.sh@33 -- # return 0 00:03:55.591 21:26:34 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
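The trace next enters get_nodes, which records how the 1024-page pool is spread across NUMA nodes; on this rig node0 holds all 1024 pages and node1 none. One way to gather the same per-node counts straight from sysfs -- a sketch only, not the setup/hugepages.sh implementation, which derives them from the meminfo reads traced above:

    # Sketch: count 2 MiB hugepages per NUMA node via sysfs.
    declare -A nodes_sys
    for node in /sys/devices/system/node/node[0-9]*; do
        n=${node##*node}
        nodes_sys[$n]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    echo "no_nodes=${#nodes_sys[@]}"    # 2 here: node0=1024, node1=0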
00:03:55.591 21:26:34 -- setup/hugepages.sh@112 -- # get_nodes 00:03:55.591 21:26:34 -- setup/hugepages.sh@27 -- # local node 00:03:55.591 21:26:34 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:55.591 21:26:34 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:55.591 21:26:34 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:55.591 21:26:34 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:55.591 21:26:34 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:55.591 21:26:34 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:55.591 21:26:34 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:55.591 21:26:34 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:55.591 21:26:34 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:55.591 21:26:34 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:55.591 21:26:34 -- setup/common.sh@18 -- # local node=0 00:03:55.591 21:26:34 -- setup/common.sh@19 -- # local var val 00:03:55.591 21:26:34 -- setup/common.sh@20 -- # local mem_f mem 00:03:55.591 21:26:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:55.591 21:26:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:55.591 21:26:34 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:55.591 21:26:34 -- setup/common.sh@28 -- # mapfile -t mem 00:03:55.591 21:26:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:55.591 21:26:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:55.591 21:26:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26436920 kB' 'MemUsed: 6155164 kB' 'SwapCached: 0 kB' 'Active: 2421616 kB' 'Inactive: 153644 kB' 'Active(anon): 2207376 kB' 'Inactive(anon): 0 kB' 'Active(file): 214240 kB' 'Inactive(file): 153644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2337188 kB' 'Mapped: 95316 kB' 'AnonPages: 241208 kB' 'Shmem: 1969304 kB' 'KernelStack: 12808 kB' 'PageTables: 4136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 114256 kB' 'Slab: 346228 kB' 'SReclaimable: 114256 kB' 'SUnreclaim: 231972 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:55.591 21:26:34 -- setup/common.sh@31 -- # read -r var val _ [... key scan against \H\u\g\e\P\a\g\e\s\_\S\u\r\p over the node0 fields, MemTotal through HugePages_Free all continuing (00:03:55.591-00:03:55.592, 21:26:34) ...] 00:03:55.592 21:26:34 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:55.592 21:26:34 -- setup/common.sh@33 -- # echo 0 00:03:55.592 21:26:34 -- setup/common.sh@33 -- # return 0 00:03:55.592 21:26:34 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:55.592 21:26:34 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:55.592 21:26:34 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:55.592 21:26:34 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:55.592 21:26:34 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:55.592 node0=1024 expecting 1024 00:03:55.592 21:26:34 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:55.592 00:03:55.592 real 0m6.958s 00:03:55.592 user 0m2.613s 00:03:55.592 sys 0m4.453s 00:03:55.592 21:26:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:55.592 21:26:34 -- common/autotest_common.sh@10 -- # set +x 00:03:55.592 ************************************ 00:03:55.592 END TEST no_shrink_alloc 00:03:55.592 ************************************ 00:03:55.592 21:26:34 -- setup/hugepages.sh@217 -- # clear_hp 00:03:55.592 21:26:34 -- setup/hugepages.sh@37 -- # local node hp 00:03:55.592 21:26:34 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:55.592 21:26:34 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:55.592 21:26:34 -- setup/hugepages.sh@41 -- # echo 0 00:03:55.592 21:26:34 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:55.592 21:26:34 -- setup/hugepages.sh@41 -- # echo 0
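The clear_hp traces just above (node 0) and continuing below (node 1) simply return every per-node hugepage pool to the kernel by writing 0 into each nr_hugepages file. A hedged sketch of the equivalent loop:

    # Sketch: zero every hugepage pool on every node (needs root).
    for hp in /sys/devices/system/node/node*/hugepages/hugepages-*; do
        echo 0 > "$hp/nr_hugepages"
    done
    export CLEAR_HUGE=yes   # flag consumed by later setup steps (assumption)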
00:03:55.592 21:26:34 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:55.592 21:26:34 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:55.592 21:26:34 -- setup/hugepages.sh@41 -- # echo 0 00:03:55.592 21:26:34 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:55.592 21:26:34 -- setup/hugepages.sh@41 -- # echo 0 00:03:55.592 21:26:34 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:55.592 21:26:34 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:55.592 00:03:55.592 real 0m26.826s 00:03:55.592 user 0m9.475s 00:03:55.592 sys 0m16.160s 00:03:55.592 21:26:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:55.592 21:26:34 -- common/autotest_common.sh@10 -- # set +x 00:03:55.592 ************************************ 00:03:55.592 END TEST hugepages 00:03:55.592 ************************************ 00:03:55.592 21:26:34 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:55.592 21:26:34 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:55.592 21:26:34 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:55.592 21:26:34 -- common/autotest_common.sh@10 -- # set +x 00:03:55.592 ************************************ 00:03:55.592 START TEST driver 00:03:55.592 ************************************ 00:03:55.592 21:26:34 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:55.592 * Looking for test storage... 00:03:55.592 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:55.592 21:26:34 -- setup/driver.sh@68 -- # setup reset 00:03:55.592 21:26:34 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:55.592 21:26:34 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:00.866 21:26:38 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:00.866 21:26:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:00.866 21:26:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:00.866 21:26:38 -- common/autotest_common.sh@10 -- # set +x 00:04:00.866 ************************************ 00:04:00.866 START TEST guess_driver 00:04:00.866 ************************************ 00:04:00.866 21:26:38 -- common/autotest_common.sh@1104 -- # guess_driver 00:04:00.866 21:26:38 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:00.866 21:26:38 -- setup/driver.sh@47 -- # local fail=0 00:04:00.866 21:26:38 -- setup/driver.sh@49 -- # pick_driver 00:04:00.866 21:26:38 -- setup/driver.sh@36 -- # vfio 00:04:00.866 21:26:38 -- setup/driver.sh@21 -- # local iommu_grups 00:04:00.866 21:26:38 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:00.866 21:26:38 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:00.866 21:26:38 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:00.866 21:26:38 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:00.866 21:26:38 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:00.866 21:26:38 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:00.866 21:26:38 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:00.866 21:26:38 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:00.866 21:26:38 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:00.866 21:26:38 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:00.866 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:00.866 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:00.866 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:00.866 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:00.866 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:00.866 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:00.866 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:00.866 21:26:38 -- setup/driver.sh@30 -- # return 0 00:04:00.866 21:26:38 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:00.866 21:26:38 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:00.866 21:26:38 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:00.866 21:26:38 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:00.866 Looking for driver=vfio-pci 00:04:00.866 21:26:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.866 21:26:38 -- setup/driver.sh@45 -- # setup output config 00:04:00.866 21:26:38 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:00.866 21:26:38 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:04.156 21:26:42 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.156 21:26:42 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.156 21:26:42 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver [... this @58/@61/@57 confirmation repeats for every remaining '->' device line that setup config reports, each one bound to vfio-pci (00:04:04.156-00:04:05.532, 21:26:42-21:26:43) ...]
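At this point guess_driver has settled on vfio-pci (the vfio_pci module resolved via modprobe --show-depends and 176 IOMMU groups exist); the loop being condensed above replays `setup.sh config` output and, for every line whose fifth field is the '->' marker, compares the bound driver against the pick. A hedged sketch of that verification loop, with the setup.sh path assumed relative to an spdk checkout:

    # Sketch: confirm every device line reported by `setup config` is bound
    # to the chosen driver; field 5 is the "->" marker, field 6 the driver.
    driver=vfio-pci
    fail=0
    while read -r _ _ _ _ marker setup_driver; do
        [[ $marker == '->' && $setup_driver != "$driver" ]] && fail=1
    done < <(./scripts/setup.sh config)   # hypothetical invocation path
    (( fail == 0 )) && echo "all devices bound to $driver"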
00:04:04.156 21:26:42 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.156 21:26:42 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.156 21:26:42 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.156 21:26:42 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.156 21:26:42 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.156 21:26:42 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.156 21:26:42 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.156 21:26:42 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.156 21:26:42 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.156 21:26:42 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.156 21:26:42 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.156 21:26:42 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.156 21:26:42 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.157 21:26:42 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.157 21:26:42 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.157 21:26:42 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.157 21:26:42 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.157 21:26:42 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.157 21:26:42 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:04.157 21:26:42 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:04.157 21:26:42 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.532 21:26:43 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:05.532 21:26:43 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:05.532 21:26:43 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:05.532 21:26:44 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:05.532 21:26:44 -- setup/driver.sh@65 -- # setup reset 00:04:05.532 21:26:44 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:05.532 21:26:44 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:10.824 00:04:10.824 real 0m9.773s 00:04:10.824 user 0m2.635s 00:04:10.824 sys 0m4.858s 00:04:10.824 21:26:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:10.824 21:26:48 -- common/autotest_common.sh@10 -- # set +x 00:04:10.824 ************************************ 00:04:10.824 END TEST guess_driver 00:04:10.824 ************************************ 00:04:10.824 00:04:10.824 real 0m14.489s 00:04:10.824 user 0m3.981s 00:04:10.824 sys 0m7.466s 00:04:10.824 21:26:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:10.824 21:26:48 -- common/autotest_common.sh@10 -- # set +x 00:04:10.824 ************************************ 00:04:10.824 END TEST driver 00:04:10.824 ************************************ 00:04:10.824 21:26:48 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:10.824 21:26:48 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:10.824 21:26:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:10.824 21:26:48 -- common/autotest_common.sh@10 -- # set +x 00:04:10.824 ************************************ 00:04:10.824 START TEST devices 00:04:10.824 ************************************ 00:04:10.824 21:26:48 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:10.824 * Looking for test storage... 
00:04:10.824 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:10.824 21:26:48 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:10.824 21:26:48 -- setup/devices.sh@192 -- # setup reset 00:04:10.824 21:26:48 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:10.824 21:26:48 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:14.113 21:26:52 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:14.113 21:26:52 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:04:14.113 21:26:52 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:04:14.113 21:26:52 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:04:14.113 21:26:52 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:04:14.113 21:26:52 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:04:14.113 21:26:52 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:04:14.113 21:26:52 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:14.113 21:26:52 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:04:14.113 21:26:52 -- setup/devices.sh@196 -- # blocks=() 00:04:14.113 21:26:52 -- setup/devices.sh@196 -- # declare -a blocks 00:04:14.113 21:26:52 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:14.113 21:26:52 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:14.113 21:26:52 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:14.113 21:26:52 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:14.113 21:26:52 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:14.113 21:26:52 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:14.113 21:26:52 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:14.113 21:26:52 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:14.114 21:26:52 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:14.114 21:26:52 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:14.114 21:26:52 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:14.114 No valid GPT data, bailing 00:04:14.114 21:26:52 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:14.114 21:26:52 -- scripts/common.sh@393 -- # pt= 00:04:14.114 21:26:52 -- scripts/common.sh@394 -- # return 1 00:04:14.114 21:26:52 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:14.114 21:26:52 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:14.114 21:26:52 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:14.114 21:26:52 -- setup/common.sh@80 -- # echo 1600321314816 00:04:14.114 21:26:52 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:14.114 21:26:52 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:14.114 21:26:52 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:14.114 21:26:52 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:14.114 21:26:52 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:14.114 21:26:52 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:14.114 21:26:52 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:14.114 21:26:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:14.114 21:26:52 -- common/autotest_common.sh@10 -- # set +x 00:04:14.114 ************************************ 00:04:14.114 START TEST nvme_mount 00:04:14.114 ************************************ 00:04:14.114 
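The scan above gates each candidate disk three ways: it must not be zoned, it must not already carry a partition table (spdk-gpt.py's "No valid GPT data, bailing" together with an empty blkid PTTYPE means the disk is free), and it must reach min_disk_size of 3221225472 bytes; this namespace reports 1600321314816. A rough equivalent of those gates with the helpers inlined (the loop body is illustrative, not the exact devices.sh code):

# Sketch of the block-device qualification pass (illustrative).
min_disk_size=$((3 * 1024 * 1024 * 1024))        # 3221225472 bytes
for block in /sys/block/nvme*n1; do
    dev=${block##*/}
    # zoned namespaces are excluded ("none" means not zoned)
    if [[ -e $block/queue/zoned && $(< "$block/queue/zoned") != none ]]; then
        continue
    fi
    # a recognizable partition table marks the disk as already in use
    if blkid -s PTTYPE -o value "/dev/$dev" >/dev/null 2>&1; then
        continue
    fi
    size=$(( $(< "$block/size") * 512 ))         # sysfs size is in 512 B sectors
    (( size >= min_disk_size )) && echo "test disk candidate: $dev ($size bytes)"
done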
21:26:52 -- common/autotest_common.sh@1104 -- # nvme_mount 00:04:14.114 21:26:52 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:14.114 21:26:52 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:14.114 21:26:52 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:14.114 21:26:52 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:14.114 21:26:52 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:14.114 21:26:52 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:14.114 21:26:52 -- setup/common.sh@40 -- # local part_no=1 00:04:14.114 21:26:52 -- setup/common.sh@41 -- # local size=1073741824 00:04:14.114 21:26:52 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:14.114 21:26:52 -- setup/common.sh@44 -- # parts=() 00:04:14.114 21:26:52 -- setup/common.sh@44 -- # local parts 00:04:14.114 21:26:52 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:14.114 21:26:52 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:14.114 21:26:52 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:14.114 21:26:52 -- setup/common.sh@46 -- # (( part++ )) 00:04:14.114 21:26:52 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:14.114 21:26:52 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:14.114 21:26:52 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:14.114 21:26:52 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:15.048 Creating new GPT entries in memory. 00:04:15.048 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:15.048 other utilities. 00:04:15.048 21:26:53 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:15.048 21:26:53 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:15.048 21:26:53 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:15.048 21:26:53 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:15.048 21:26:53 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:15.986 Creating new GPT entries in memory. 00:04:15.986 The operation has completed successfully. 
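The sgdisk numbers above follow directly from the requested size: 1073741824 bytes over 512-byte sectors gives 2097152 sectors, and with the first usable LBA at 2048 the partition ends at 2048 + 2097152 - 1 = 2099199. The zap-then-create sequence, serialized with flock so concurrent sgdisk runs cannot race (udevadm settle stands in here for what sync_dev_uevents.sh provides):

disk=/dev/nvme0n1
size=$(( 1073741824 / 512 ))            # 2097152 sectors per 1 GiB partition
part_start=2048                         # first usable LBA on a fresh GPT
part_end=$(( part_start + size - 1 ))   # 2099199, as in the trace
sgdisk "$disk" --zap-all                # destroy any existing GPT and MBR
flock "$disk" sgdisk "$disk" --new=1:${part_start}:${part_end}
udevadm settle                          # wait until /dev/nvme0n1p1 exists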
00:04:15.986 21:26:54 -- setup/common.sh@57 -- # (( part++ )) 00:04:15.986 21:26:54 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:15.986 21:26:54 -- setup/common.sh@62 -- # wait 3537361 00:04:15.986 21:26:54 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.986 21:26:54 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:15.986 21:26:54 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.986 21:26:54 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:15.986 21:26:54 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:15.986 21:26:54 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.986 21:26:54 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:15.986 21:26:54 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:15.986 21:26:54 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:15.986 21:26:54 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.986 21:26:54 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:15.986 21:26:54 -- setup/devices.sh@53 -- # local found=0 00:04:15.986 21:26:54 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:15.986 21:26:54 -- setup/devices.sh@56 -- # : 00:04:15.986 21:26:54 -- setup/devices.sh@59 -- # local pci status 00:04:15.986 21:26:54 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.986 21:26:54 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:15.986 21:26:54 -- setup/devices.sh@47 -- # setup output config 00:04:15.986 21:26:54 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:15.986 21:26:54 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:19.276 21:26:57 -- setup/devices.sh@63 -- # found=1 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 
]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:57 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:57 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.276 21:26:58 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:19.276 21:26:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.535 21:26:58 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:19.535 21:26:58 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:19.535 21:26:58 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:19.535 21:26:58 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:19.535 21:26:58 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:19.535 21:26:58 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:19.535 21:26:58 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:19.535 21:26:58 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:19.535 21:26:58 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:19.535 21:26:58 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:19.535 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:19.535 21:26:58 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:19.535 21:26:58 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:19.795 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:19.795 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:19.795 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:04:19.795 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:19.795 21:26:58 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:19.795 21:26:58 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:19.795 21:26:58 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:19.795 21:26:58 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:19.795 21:26:58 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:19.795 21:26:58 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:19.795 21:26:58 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:19.795 21:26:58 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:19.795 21:26:58 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:19.795 21:26:58 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:19.795 21:26:58 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:19.795 21:26:58 -- setup/devices.sh@53 -- # local found=0 00:04:19.795 21:26:58 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:19.795 21:26:58 -- setup/devices.sh@56 -- # : 00:04:19.795 21:26:58 -- setup/devices.sh@59 -- # local pci status 00:04:19.795 21:26:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:19.795 21:26:58 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:19.795 21:26:58 -- setup/devices.sh@47 -- # setup output config 00:04:19.795 21:26:58 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:19.795 21:26:58 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:23.086 21:27:01 -- setup/devices.sh@63 -- # found=1 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:23.086 21:27:01 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:23.086 21:27:01 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:23.086 21:27:01 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:23.086 21:27:01 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:23.086 21:27:01 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:23.086 21:27:01 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:23.086 21:27:01 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:23.086 21:27:01 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:23.086 21:27:01 -- setup/devices.sh@50 -- # local mount_point= 00:04:23.086 21:27:01 -- setup/devices.sh@51 -- # local test_file= 00:04:23.086 21:27:01 -- setup/devices.sh@53 -- # local found=0 00:04:23.086 21:27:01 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:23.086 21:27:01 -- setup/devices.sh@59 -- # local pci status 00:04:23.086 21:27:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:23.086 21:27:01 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:23.086 21:27:01 -- setup/devices.sh@47 -- # setup output config 00:04:23.086 21:27:01 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:23.086 21:27:01 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:26.375 21:27:04 -- setup/devices.sh@63 -- # found=1 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:26.375 21:27:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:26.375 21:27:05 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:26.375 21:27:05 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:26.375 21:27:05 -- setup/devices.sh@68 -- # return 0 00:04:26.375 21:27:05 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:26.375 21:27:05 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:26.375 21:27:05 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:26.375 21:27:05 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:26.375 21:27:05 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:26.375 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:26.375 00:04:26.375 real 0m12.488s 00:04:26.375 user 0m3.583s 00:04:26.375 sys 0m6.792s 00:04:26.375 21:27:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:26.375 21:27:05 -- common/autotest_common.sh@10 -- # set +x 00:04:26.375 ************************************ 00:04:26.375 END TEST nvme_mount 00:04:26.375 ************************************ 00:04:26.375 21:27:05 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:26.375 21:27:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:26.375 21:27:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:26.375 21:27:05 -- common/autotest_common.sh@10 -- # set +x 00:04:26.375 ************************************ 00:04:26.375 START TEST dm_mount 00:04:26.375 ************************************ 00:04:26.375 21:27:05 -- common/autotest_common.sh@1104 -- # dm_mount 00:04:26.375 21:27:05 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:26.375 21:27:05 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:26.375 21:27:05 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:26.375 21:27:05 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:26.375 21:27:05 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:26.375 21:27:05 -- setup/common.sh@40 -- # local part_no=2 00:04:26.375 21:27:05 -- setup/common.sh@41 -- # local size=1073741824 00:04:26.375 21:27:05 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:26.375 21:27:05 -- setup/common.sh@44 -- # parts=() 00:04:26.375 21:27:05 -- setup/common.sh@44 -- # local parts 00:04:26.375 21:27:05 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:26.375 21:27:05 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:26.375 21:27:05 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:26.375 21:27:05 -- setup/common.sh@46 -- # (( part++ )) 00:04:26.375 21:27:05 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:26.375 21:27:05 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:26.375 21:27:05 -- setup/common.sh@46 -- # (( part++ )) 00:04:26.375 21:27:05 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:26.375 21:27:05 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:26.375 21:27:05 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:26.375 21:27:05 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:27.444 Creating new GPT entries in memory. 00:04:27.444 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:27.444 other utilities. 00:04:27.444 21:27:06 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:27.444 21:27:06 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:27.444 21:27:06 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:27.444 21:27:06 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:27.444 21:27:06 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:28.380 Creating new GPT entries in memory. 00:04:28.380 The operation has completed successfully. 
00:04:28.380 21:27:07 -- setup/common.sh@57 -- # (( part++ )) 00:04:28.380 21:27:07 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:28.380 21:27:07 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:28.380 21:27:07 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:28.380 21:27:07 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:29.757 The operation has completed successfully. 00:04:29.757 21:27:08 -- setup/common.sh@57 -- # (( part++ )) 00:04:29.757 21:27:08 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:29.757 21:27:08 -- setup/common.sh@62 -- # wait 3542144 00:04:29.757 21:27:08 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:29.757 21:27:08 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:29.757 21:27:08 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:29.757 21:27:08 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:29.757 21:27:08 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:29.757 21:27:08 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:29.757 21:27:08 -- setup/devices.sh@161 -- # break 00:04:29.757 21:27:08 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:29.757 21:27:08 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:29.757 21:27:08 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:29.757 21:27:08 -- setup/devices.sh@166 -- # dm=dm-0 00:04:29.757 21:27:08 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:29.757 21:27:08 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:29.757 21:27:08 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:29.757 21:27:08 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:29.757 21:27:08 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:29.757 21:27:08 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:29.757 21:27:08 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:29.757 21:27:08 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:29.757 21:27:08 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:29.757 21:27:08 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:29.757 21:27:08 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:29.757 21:27:08 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:29.757 21:27:08 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:29.757 21:27:08 -- setup/devices.sh@53 -- # local found=0 00:04:29.757 21:27:08 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:29.757 21:27:08 -- setup/devices.sh@56 -- # : 
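dmsetup create nvme_dm_test reads its mapping table from stdin, which the xtrace does not echo; a linear concatenation of the two 1 GiB partitions is the conventional table for this test and is assumed below. Once the device exists, /dev/mapper/nvme_dm_test resolves to /dev/dm-0 and each member partition gains a holders/dm-0 entry, which is exactly what devices.sh@168-169 checks:

# Sketch: build one linear dm device from the two test partitions.
# The table layout is an assumption; the trace does not show dmsetup's stdin.
p1=/dev/nvme0n1p1 p2=/dev/nvme0n1p2
s1=$(blockdev --getsz "$p1")            # partition sizes in 512 B sectors
s2=$(blockdev --getsz "$p2")
dmsetup create nvme_dm_test <<EOF
0 $s1 linear $p1 0
$s1 $s2 linear $p2 0
EOF
readlink -f /dev/mapper/nvme_dm_test    # -> /dev/dm-0
ls /sys/class/block/nvme0n1p1/holders   # -> dm-0 on success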
00:04:29.757 21:27:08 -- setup/devices.sh@59 -- # local pci status 00:04:29.757 21:27:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:29.757 21:27:08 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:29.757 21:27:08 -- setup/devices.sh@47 -- # setup output config 00:04:29.757 21:27:08 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:29.757 21:27:08 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:33.040 21:27:11 -- setup/devices.sh@63 -- # found=1 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:33.040 21:27:11 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:33.040 21:27:11 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:33.040 21:27:11 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:33.040 21:27:11 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:33.040 21:27:11 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:33.040 21:27:11 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:33.040 21:27:11 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:33.040 21:27:11 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:33.040 21:27:11 -- setup/devices.sh@50 -- # local mount_point= 00:04:33.040 21:27:11 -- setup/devices.sh@51 -- # local test_file= 00:04:33.040 21:27:11 -- setup/devices.sh@53 -- # local found=0 00:04:33.040 21:27:11 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:33.040 21:27:11 -- setup/devices.sh@59 -- # local pci status 00:04:33.040 21:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.040 21:27:11 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:33.040 21:27:11 -- setup/devices.sh@47 -- # setup output config 00:04:33.040 21:27:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:33.040 21:27:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:36.327 21:27:14 -- setup/devices.sh@63 -- # found=1 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.327 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.327 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.328 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.328 21:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:36.328 21:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.328 21:27:14 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:36.328 21:27:14 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:36.328 21:27:14 -- setup/devices.sh@68 -- # return 0 00:04:36.328 21:27:14 -- setup/devices.sh@187 -- # cleanup_dm 00:04:36.328 21:27:14 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.328 21:27:14 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:36.328 21:27:14 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:36.328 21:27:14 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:36.328 21:27:14 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:36.328 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:36.328 21:27:14 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:36.328 21:27:14 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:36.328 00:04:36.328 real 0m9.710s 00:04:36.328 user 0m2.365s 00:04:36.328 sys 0m4.411s 00:04:36.328 21:27:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:36.328 21:27:14 -- common/autotest_common.sh@10 -- # set +x 00:04:36.328 ************************************ 00:04:36.328 END TEST dm_mount 00:04:36.328 ************************************ 00:04:36.328 21:27:14 -- setup/devices.sh@1 -- # cleanup 00:04:36.328 21:27:14 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:36.328 21:27:14 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:36.328 21:27:14 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:36.328 21:27:14 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:36.328 21:27:14 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:36.328 21:27:14 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:36.586 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:36.587 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:36.587 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:36.587 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:36.587 21:27:15 -- setup/devices.sh@12 -- # cleanup_dm 00:04:36.587 21:27:15 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.587 21:27:15 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:36.587 21:27:15 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:36.587 21:27:15 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:36.587 21:27:15 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:36.587 21:27:15 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:36.587 00:04:36.587 real 0m26.365s 00:04:36.587 user 0m7.272s 00:04:36.587 sys 0m13.949s 00:04:36.587 21:27:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:36.587 21:27:15 -- common/autotest_common.sh@10 -- # set +x 00:04:36.587 ************************************ 00:04:36.587 END TEST devices 00:04:36.587 ************************************ 00:04:36.587 00:04:36.587 real 1m30.242s 00:04:36.587 user 0m27.347s 00:04:36.587 sys 0m51.355s 00:04:36.587 21:27:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:36.587 21:27:15 -- common/autotest_common.sh@10 -- # set +x 00:04:36.587 ************************************ 00:04:36.587 END TEST setup.sh 00:04:36.587 ************************************ 00:04:36.587 21:27:15 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:39.883 Hugepages 00:04:39.883 node hugesize free / total 00:04:39.883 node0 1048576kB 0 / 0 00:04:39.883 node0 2048kB 2048 / 2048 00:04:39.883 node1 1048576kB 0 / 0 00:04:39.883 node1 2048kB 0 / 0 00:04:39.883 00:04:39.883 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:39.883 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:39.883 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:39.884 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:39.884 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:39.884 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:39.884 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:39.884 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:39.884 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:39.884 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:39.884 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:39.884 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:39.884 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:39.884 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:39.884 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:39.884 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:39.884 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:39.884 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:04:39.884 21:27:18 -- spdk/autotest.sh@141 -- # uname -s 00:04:39.884 21:27:18 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]] 00:04:39.884 21:27:18 -- spdk/autotest.sh@143 -- # nvme_namespace_revert 00:04:39.884 21:27:18 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:43.177 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
00:04:43.177 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:43.177 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:45.084 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:45.084 21:27:23 -- common/autotest_common.sh@1517 -- # sleep 1 00:04:46.023 21:27:24 -- common/autotest_common.sh@1518 -- # bdfs=() 00:04:46.023 21:27:24 -- common/autotest_common.sh@1518 -- # local bdfs 00:04:46.023 21:27:24 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:04:46.023 21:27:24 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:04:46.023 21:27:24 -- common/autotest_common.sh@1498 -- # bdfs=() 00:04:46.023 21:27:24 -- common/autotest_common.sh@1498 -- # local bdfs 00:04:46.023 21:27:24 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:46.023 21:27:24 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:46.023 21:27:24 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:04:46.023 21:27:24 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:04:46.023 21:27:24 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:04:46.023 21:27:24 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:49.312 Waiting for block devices as requested 00:04:49.312 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:49.312 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:49.312 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:49.312 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:49.312 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:49.572 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:49.572 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:49.572 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:49.831 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:49.831 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:49.831 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:50.090 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:50.090 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:50.090 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:50.349 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:50.349 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:50.349 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:04:50.609 21:27:29 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:04:50.609 21:27:29 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:04:50.609 21:27:29 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:04:50.609 21:27:29 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:04:50.609 21:27:29 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:50.609 21:27:29 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:04:50.609 21:27:29 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:50.609 21:27:29 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:04:50.609 21:27:29 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:04:50.609 21:27:29 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:04:50.609 21:27:29 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:04:50.609 21:27:29 -- common/autotest_common.sh@1530 -- # grep oacs 00:04:50.609 21:27:29 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:04:50.609 21:27:29 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:04:50.609 21:27:29 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:04:50.609 21:27:29 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:04:50.609 21:27:29 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:04:50.609 21:27:29 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:04:50.609 21:27:29 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:04:50.609 21:27:29 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:04:50.609 21:27:29 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:04:50.609 21:27:29 -- common/autotest_common.sh@1542 -- # continue 00:04:50.609 21:27:29 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:04:50.609 21:27:29 -- common/autotest_common.sh@718 -- # xtrace_disable 00:04:50.609 21:27:29 -- common/autotest_common.sh@10 -- # set +x 00:04:50.609 21:27:29 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:04:50.609 21:27:29 -- common/autotest_common.sh@712 -- # xtrace_disable 00:04:50.609 21:27:29 -- common/autotest_common.sh@10 -- # set +x 00:04:50.609 21:27:29 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:53.900 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:53.900 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:55.279 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:55.279 21:27:34 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:04:55.279 21:27:34 -- common/autotest_common.sh@718 -- # xtrace_disable 00:04:55.279 21:27:34 -- common/autotest_common.sh@10 -- # set +x 00:04:55.538 21:27:34 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:04:55.538 21:27:34 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:04:55.538 21:27:34 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:04:55.538 21:27:34 -- common/autotest_common.sh@1562 -- # bdfs=() 00:04:55.538 21:27:34 -- common/autotest_common.sh@1562 -- # local bdfs 00:04:55.538 21:27:34 -- common/autotest_common.sh@1564 -- # 
get_nvme_bdfs 00:04:55.538 21:27:34 -- common/autotest_common.sh@1498 -- # bdfs=() 00:04:55.538 21:27:34 -- common/autotest_common.sh@1498 -- # local bdfs 00:04:55.538 21:27:34 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:55.538 21:27:34 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:55.538 21:27:34 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:04:55.538 21:27:34 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:04:55.538 21:27:34 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:04:55.538 21:27:34 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:04:55.538 21:27:34 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:04:55.538 21:27:34 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:04:55.538 21:27:34 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:04:55.538 21:27:34 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:04:55.538 21:27:34 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:04:55.538 21:27:34 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:04:55.538 21:27:34 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=3552071 00:04:55.538 21:27:34 -- common/autotest_common.sh@1583 -- # waitforlisten 3552071 00:04:55.538 21:27:34 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:55.538 21:27:34 -- common/autotest_common.sh@819 -- # '[' -z 3552071 ']' 00:04:55.538 21:27:34 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:55.538 21:27:34 -- common/autotest_common.sh@824 -- # local max_retries=100 00:04:55.538 21:27:34 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:55.539 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:55.539 21:27:34 -- common/autotest_common.sh@828 -- # xtrace_disable 00:04:55.539 21:27:34 -- common/autotest_common.sh@10 -- # set +x 00:04:55.539 [2024-07-12 21:27:34.223632] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:04:55.539 [2024-07-12 21:27:34.223699] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3552071 ] 00:04:55.539 EAL: No free 2048 kB hugepages reported on node 1 00:04:55.539 [2024-07-12 21:27:34.292604] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:55.798 [2024-07-12 21:27:34.372321] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:55.798 [2024-07-12 21:27:34.372428] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:56.366 21:27:35 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:04:56.366 21:27:35 -- common/autotest_common.sh@852 -- # return 0 00:04:56.366 21:27:35 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:04:56.366 21:27:35 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:04:56.366 21:27:35 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:04:59.654 nvme0n1 00:04:59.654 21:27:38 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:04:59.654 [2024-07-12 21:27:38.180069] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:04:59.654 request: 00:04:59.654 { 00:04:59.654 "nvme_ctrlr_name": "nvme0", 00:04:59.654 "password": "test", 00:04:59.654 "method": "bdev_nvme_opal_revert", 00:04:59.654 "req_id": 1 00:04:59.654 } 00:04:59.654 Got JSON-RPC error response 00:04:59.654 response: 00:04:59.654 { 00:04:59.654 "code": -32602, 00:04:59.654 "message": "Invalid parameters" 00:04:59.654 } 00:04:59.654 21:27:38 -- common/autotest_common.sh@1589 -- # true 00:04:59.654 21:27:38 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:04:59.654 21:27:38 -- common/autotest_common.sh@1593 -- # killprocess 3552071 00:04:59.654 21:27:38 -- common/autotest_common.sh@926 -- # '[' -z 3552071 ']' 00:04:59.654 21:27:38 -- common/autotest_common.sh@930 -- # kill -0 3552071 00:04:59.654 21:27:38 -- common/autotest_common.sh@931 -- # uname 00:04:59.654 21:27:38 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:04:59.654 21:27:38 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3552071 00:04:59.654 21:27:38 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:04:59.654 21:27:38 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:04:59.654 21:27:38 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3552071' 00:04:59.654 killing process with pid 3552071 00:04:59.654 21:27:38 -- common/autotest_common.sh@945 -- # kill 3552071 00:04:59.654 21:27:38 -- common/autotest_common.sh@950 -- # wait 3552071 00:05:02.190 21:27:40 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']' 00:05:02.190 21:27:40 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']' 00:05:02.190 21:27:40 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:02.190 21:27:40 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:02.190 21:27:40 -- spdk/autotest.sh@173 -- # timing_enter lib 00:05:02.190 21:27:40 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:02.190 21:27:40 -- common/autotest_common.sh@10 -- # set +x 00:05:02.190 21:27:40 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:02.190 
21:27:40 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:02.190 21:27:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:02.190 21:27:40 -- common/autotest_common.sh@10 -- # set +x 00:05:02.190 ************************************ 00:05:02.190 START TEST env 00:05:02.190 ************************************ 00:05:02.190 21:27:40 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:02.190 * Looking for test storage... 00:05:02.190 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:02.190 21:27:40 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:02.190 21:27:40 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:02.190 21:27:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:02.190 21:27:40 -- common/autotest_common.sh@10 -- # set +x 00:05:02.190 ************************************ 00:05:02.190 START TEST env_memory 00:05:02.190 ************************************ 00:05:02.190 21:27:40 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:02.190 00:05:02.190 00:05:02.190 CUnit - A unit testing framework for C - Version 2.1-3 00:05:02.190 http://cunit.sourceforge.net/ 00:05:02.190 00:05:02.190 00:05:02.190 Suite: memory 00:05:02.190 Test: alloc and free memory map ...[2024-07-12 21:27:40.574009] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:02.190 passed 00:05:02.190 Test: mem map translation ...[2024-07-12 21:27:40.587751] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:02.190 [2024-07-12 21:27:40.587767] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:02.190 [2024-07-12 21:27:40.587797] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:02.190 [2024-07-12 21:27:40.587806] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:02.190 passed 00:05:02.190 Test: mem map registration ...[2024-07-12 21:27:40.609698] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:02.190 [2024-07-12 21:27:40.609717] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:02.190 passed 00:05:02.190 Test: mem map adjacent registrations ...passed 00:05:02.190 00:05:02.190 Run Summary: Type Total Ran Passed Failed Inactive 00:05:02.190 suites 1 1 n/a 0 0 00:05:02.190 tests 4 4 4 0 0 00:05:02.190 asserts 152 152 152 0 n/a 00:05:02.190 00:05:02.190 Elapsed time = 0.090 seconds 00:05:02.190 00:05:02.190 real 0m0.104s 00:05:02.190 user 0m0.094s 00:05:02.190 sys 0m0.009s 00:05:02.190 21:27:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:02.190 21:27:40 -- 
common/autotest_common.sh@10 -- # set +x 00:05:02.190 ************************************ 00:05:02.190 END TEST env_memory 00:05:02.190 ************************************ 00:05:02.190 21:27:40 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:02.190 21:27:40 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:02.190 21:27:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:02.190 21:27:40 -- common/autotest_common.sh@10 -- # set +x 00:05:02.190 ************************************ 00:05:02.190 START TEST env_vtophys 00:05:02.190 ************************************ 00:05:02.190 21:27:40 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:02.190 EAL: lib.eal log level changed from notice to debug 00:05:02.190 EAL: Detected lcore 0 as core 0 on socket 0 00:05:02.190 EAL: Detected lcore 1 as core 1 on socket 0 00:05:02.190 EAL: Detected lcore 2 as core 2 on socket 0 00:05:02.190 EAL: Detected lcore 3 as core 3 on socket 0 00:05:02.190 EAL: Detected lcore 4 as core 4 on socket 0 00:05:02.190 EAL: Detected lcore 5 as core 5 on socket 0 00:05:02.190 EAL: Detected lcore 6 as core 6 on socket 0 00:05:02.190 EAL: Detected lcore 7 as core 8 on socket 0 00:05:02.190 EAL: Detected lcore 8 as core 9 on socket 0 00:05:02.190 EAL: Detected lcore 9 as core 10 on socket 0 00:05:02.190 EAL: Detected lcore 10 as core 11 on socket 0 00:05:02.190 EAL: Detected lcore 11 as core 12 on socket 0 00:05:02.190 EAL: Detected lcore 12 as core 13 on socket 0 00:05:02.190 EAL: Detected lcore 13 as core 14 on socket 0 00:05:02.190 EAL: Detected lcore 14 as core 16 on socket 0 00:05:02.190 EAL: Detected lcore 15 as core 17 on socket 0 00:05:02.190 EAL: Detected lcore 16 as core 18 on socket 0 00:05:02.190 EAL: Detected lcore 17 as core 19 on socket 0 00:05:02.190 EAL: Detected lcore 18 as core 20 on socket 0 00:05:02.190 EAL: Detected lcore 19 as core 21 on socket 0 00:05:02.190 EAL: Detected lcore 20 as core 22 on socket 0 00:05:02.190 EAL: Detected lcore 21 as core 24 on socket 0 00:05:02.190 EAL: Detected lcore 22 as core 25 on socket 0 00:05:02.190 EAL: Detected lcore 23 as core 26 on socket 0 00:05:02.190 EAL: Detected lcore 24 as core 27 on socket 0 00:05:02.190 EAL: Detected lcore 25 as core 28 on socket 0 00:05:02.190 EAL: Detected lcore 26 as core 29 on socket 0 00:05:02.190 EAL: Detected lcore 27 as core 30 on socket 0 00:05:02.190 EAL: Detected lcore 28 as core 0 on socket 1 00:05:02.190 EAL: Detected lcore 29 as core 1 on socket 1 00:05:02.190 EAL: Detected lcore 30 as core 2 on socket 1 00:05:02.190 EAL: Detected lcore 31 as core 3 on socket 1 00:05:02.190 EAL: Detected lcore 32 as core 4 on socket 1 00:05:02.190 EAL: Detected lcore 33 as core 5 on socket 1 00:05:02.190 EAL: Detected lcore 34 as core 6 on socket 1 00:05:02.190 EAL: Detected lcore 35 as core 8 on socket 1 00:05:02.190 EAL: Detected lcore 36 as core 9 on socket 1 00:05:02.190 EAL: Detected lcore 37 as core 10 on socket 1 00:05:02.190 EAL: Detected lcore 38 as core 11 on socket 1 00:05:02.190 EAL: Detected lcore 39 as core 12 on socket 1 00:05:02.190 EAL: Detected lcore 40 as core 13 on socket 1 00:05:02.190 EAL: Detected lcore 41 as core 14 on socket 1 00:05:02.190 EAL: Detected lcore 42 as core 16 on socket 1 00:05:02.190 EAL: Detected lcore 43 as core 17 on socket 1 00:05:02.190 EAL: Detected lcore 44 as core 18 on socket 1 00:05:02.190 EAL: Detected lcore 45 as core 19 on 
socket 1 00:05:02.190 EAL: Detected lcore 46 as core 20 on socket 1 00:05:02.190 EAL: Detected lcore 47 as core 21 on socket 1 00:05:02.190 EAL: Detected lcore 48 as core 22 on socket 1 00:05:02.190 EAL: Detected lcore 49 as core 24 on socket 1 00:05:02.190 EAL: Detected lcore 50 as core 25 on socket 1 00:05:02.190 EAL: Detected lcore 51 as core 26 on socket 1 00:05:02.190 EAL: Detected lcore 52 as core 27 on socket 1 00:05:02.191 EAL: Detected lcore 53 as core 28 on socket 1 00:05:02.191 EAL: Detected lcore 54 as core 29 on socket 1 00:05:02.191 EAL: Detected lcore 55 as core 30 on socket 1 00:05:02.191 EAL: Detected lcore 56 as core 0 on socket 0 00:05:02.191 EAL: Detected lcore 57 as core 1 on socket 0 00:05:02.191 EAL: Detected lcore 58 as core 2 on socket 0 00:05:02.191 EAL: Detected lcore 59 as core 3 on socket 0 00:05:02.191 EAL: Detected lcore 60 as core 4 on socket 0 00:05:02.191 EAL: Detected lcore 61 as core 5 on socket 0 00:05:02.191 EAL: Detected lcore 62 as core 6 on socket 0 00:05:02.191 EAL: Detected lcore 63 as core 8 on socket 0 00:05:02.191 EAL: Detected lcore 64 as core 9 on socket 0 00:05:02.191 EAL: Detected lcore 65 as core 10 on socket 0 00:05:02.191 EAL: Detected lcore 66 as core 11 on socket 0 00:05:02.191 EAL: Detected lcore 67 as core 12 on socket 0 00:05:02.191 EAL: Detected lcore 68 as core 13 on socket 0 00:05:02.191 EAL: Detected lcore 69 as core 14 on socket 0 00:05:02.191 EAL: Detected lcore 70 as core 16 on socket 0 00:05:02.191 EAL: Detected lcore 71 as core 17 on socket 0 00:05:02.191 EAL: Detected lcore 72 as core 18 on socket 0 00:05:02.191 EAL: Detected lcore 73 as core 19 on socket 0 00:05:02.191 EAL: Detected lcore 74 as core 20 on socket 0 00:05:02.191 EAL: Detected lcore 75 as core 21 on socket 0 00:05:02.191 EAL: Detected lcore 76 as core 22 on socket 0 00:05:02.191 EAL: Detected lcore 77 as core 24 on socket 0 00:05:02.191 EAL: Detected lcore 78 as core 25 on socket 0 00:05:02.191 EAL: Detected lcore 79 as core 26 on socket 0 00:05:02.191 EAL: Detected lcore 80 as core 27 on socket 0 00:05:02.191 EAL: Detected lcore 81 as core 28 on socket 0 00:05:02.191 EAL: Detected lcore 82 as core 29 on socket 0 00:05:02.191 EAL: Detected lcore 83 as core 30 on socket 0 00:05:02.191 EAL: Detected lcore 84 as core 0 on socket 1 00:05:02.191 EAL: Detected lcore 85 as core 1 on socket 1 00:05:02.191 EAL: Detected lcore 86 as core 2 on socket 1 00:05:02.191 EAL: Detected lcore 87 as core 3 on socket 1 00:05:02.191 EAL: Detected lcore 88 as core 4 on socket 1 00:05:02.191 EAL: Detected lcore 89 as core 5 on socket 1 00:05:02.191 EAL: Detected lcore 90 as core 6 on socket 1 00:05:02.191 EAL: Detected lcore 91 as core 8 on socket 1 00:05:02.191 EAL: Detected lcore 92 as core 9 on socket 1 00:05:02.191 EAL: Detected lcore 93 as core 10 on socket 1 00:05:02.191 EAL: Detected lcore 94 as core 11 on socket 1 00:05:02.191 EAL: Detected lcore 95 as core 12 on socket 1 00:05:02.191 EAL: Detected lcore 96 as core 13 on socket 1 00:05:02.191 EAL: Detected lcore 97 as core 14 on socket 1 00:05:02.191 EAL: Detected lcore 98 as core 16 on socket 1 00:05:02.191 EAL: Detected lcore 99 as core 17 on socket 1 00:05:02.191 EAL: Detected lcore 100 as core 18 on socket 1 00:05:02.191 EAL: Detected lcore 101 as core 19 on socket 1 00:05:02.191 EAL: Detected lcore 102 as core 20 on socket 1 00:05:02.191 EAL: Detected lcore 103 as core 21 on socket 1 00:05:02.191 EAL: Detected lcore 104 as core 22 on socket 1 00:05:02.191 EAL: Detected lcore 105 as core 24 on socket 1 00:05:02.191 EAL: 
Detected lcore 106 as core 25 on socket 1 00:05:02.191 EAL: Detected lcore 107 as core 26 on socket 1 00:05:02.191 EAL: Detected lcore 108 as core 27 on socket 1 00:05:02.191 EAL: Detected lcore 109 as core 28 on socket 1 00:05:02.191 EAL: Detected lcore 110 as core 29 on socket 1 00:05:02.191 EAL: Detected lcore 111 as core 30 on socket 1 00:05:02.191 EAL: Maximum logical cores by configuration: 128 00:05:02.191 EAL: Detected CPU lcores: 112 00:05:02.191 EAL: Detected NUMA nodes: 2 00:05:02.191 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:02.191 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:02.191 EAL: Checking presence of .so 'librte_eal.so' 00:05:02.191 EAL: Detected static linkage of DPDK 00:05:02.191 EAL: No shared files mode enabled, IPC will be disabled 00:05:02.191 EAL: Bus pci wants IOVA as 'DC' 00:05:02.191 EAL: Buses did not request a specific IOVA mode. 00:05:02.191 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:02.191 EAL: Selected IOVA mode 'VA' 00:05:02.191 EAL: No free 2048 kB hugepages reported on node 1 00:05:02.191 EAL: Probing VFIO support... 00:05:02.191 EAL: IOMMU type 1 (Type 1) is supported 00:05:02.191 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:02.191 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:02.191 EAL: VFIO support initialized 00:05:02.191 EAL: Ask a virtual area of 0x2e000 bytes 00:05:02.191 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:02.191 EAL: Setting up physically contiguous memory... 00:05:02.191 EAL: Setting maximum number of open files to 524288 00:05:02.191 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:02.191 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:02.191 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:02.191 EAL: Ask a virtual area of 0x61000 bytes 00:05:02.191 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:02.191 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:02.191 EAL: Ask a virtual area of 0x400000000 bytes 00:05:02.191 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:02.191 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:02.191 EAL: Ask a virtual area of 0x61000 bytes 00:05:02.191 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:02.191 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:02.191 EAL: Ask a virtual area of 0x400000000 bytes 00:05:02.191 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:02.191 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:02.191 EAL: Ask a virtual area of 0x61000 bytes 00:05:02.191 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:02.191 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:02.191 EAL: Ask a virtual area of 0x400000000 bytes 00:05:02.191 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:02.191 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:02.191 EAL: Ask a virtual area of 0x61000 bytes 00:05:02.191 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:02.191 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:02.191 EAL: Ask a virtual area of 0x400000000 bytes 00:05:02.191 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:02.191 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:02.191 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:02.191 
EAL: Ask a virtual area of 0x61000 bytes 00:05:02.191 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:02.191 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:02.191 EAL: Ask a virtual area of 0x400000000 bytes 00:05:02.191 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:02.191 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:02.191 EAL: Ask a virtual area of 0x61000 bytes 00:05:02.191 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:02.191 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:02.191 EAL: Ask a virtual area of 0x400000000 bytes 00:05:02.191 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:02.191 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:02.191 EAL: Ask a virtual area of 0x61000 bytes 00:05:02.191 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:02.191 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:02.191 EAL: Ask a virtual area of 0x400000000 bytes 00:05:02.191 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:02.191 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:02.191 EAL: Ask a virtual area of 0x61000 bytes 00:05:02.191 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:02.191 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:02.191 EAL: Ask a virtual area of 0x400000000 bytes 00:05:02.191 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:02.191 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:02.191 EAL: Hugepages will be freed exactly as allocated. 00:05:02.191 EAL: No shared files mode enabled, IPC is disabled 00:05:02.191 EAL: No shared files mode enabled, IPC is disabled 00:05:02.191 EAL: TSC frequency is ~2500000 KHz 00:05:02.191 EAL: Main lcore 0 is ready (tid=7f177551da00;cpuset=[0]) 00:05:02.191 EAL: Trying to obtain current memory policy. 00:05:02.191 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:02.191 EAL: Restoring previous memory policy: 0 00:05:02.191 EAL: request: mp_malloc_sync 00:05:02.191 EAL: No shared files mode enabled, IPC is disabled 00:05:02.191 EAL: Heap on socket 0 was expanded by 2MB 00:05:02.191 EAL: No shared files mode enabled, IPC is disabled 00:05:02.191 EAL: Mem event callback 'spdk:(nil)' registered 00:05:02.191 00:05:02.191 00:05:02.191 CUnit - A unit testing framework for C - Version 2.1-3 00:05:02.191 http://cunit.sourceforge.net/ 00:05:02.191 00:05:02.191 00:05:02.191 Suite: components_suite 00:05:02.191 Test: vtophys_malloc_test ...passed 00:05:02.191 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:02.191 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:02.191 EAL: Restoring previous memory policy: 4 00:05:02.191 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.191 EAL: request: mp_malloc_sync 00:05:02.191 EAL: No shared files mode enabled, IPC is disabled 00:05:02.191 EAL: Heap on socket 0 was expanded by 4MB 00:05:02.191 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.191 EAL: request: mp_malloc_sync 00:05:02.191 EAL: No shared files mode enabled, IPC is disabled 00:05:02.191 EAL: Heap on socket 0 was shrunk by 4MB 00:05:02.191 EAL: Trying to obtain current memory policy. 
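The expand/shrink pairs in this stretch are the vtophys_spdk_malloc_test loop: each iteration allocates a progressively larger DPDK buffer, the 'spdk:' mem event callback fires as the hugepage-backed heap grows, and the matching shrink fires on free. A sketch of rerunning this test by hand, assuming HUGEMEM is the setup.sh knob (in MB) for sizing the hugepage pool:

```bash
# Hand-run of the vtophys test binary whose output appears here (paths from this job).
spdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
sudo HUGEMEM=2048 "$spdk_dir/scripts/setup.sh"   # reserve the 2 MB hugepage pool first
sudo "$spdk_dir/test/env/vtophys/vtophys"        # emits the expand/shrink log seen here
```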
00:05:02.191 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:02.191 EAL: Restoring previous memory policy: 4 00:05:02.191 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.191 EAL: request: mp_malloc_sync 00:05:02.191 EAL: No shared files mode enabled, IPC is disabled 00:05:02.191 EAL: Heap on socket 0 was expanded by 6MB 00:05:02.191 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.191 EAL: request: mp_malloc_sync 00:05:02.191 EAL: No shared files mode enabled, IPC is disabled 00:05:02.191 EAL: Heap on socket 0 was shrunk by 6MB 00:05:02.191 EAL: Trying to obtain current memory policy. 00:05:02.191 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:02.191 EAL: Restoring previous memory policy: 4 00:05:02.191 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.191 EAL: request: mp_malloc_sync 00:05:02.191 EAL: No shared files mode enabled, IPC is disabled 00:05:02.191 EAL: Heap on socket 0 was expanded by 10MB 00:05:02.191 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.191 EAL: request: mp_malloc_sync 00:05:02.191 EAL: No shared files mode enabled, IPC is disabled 00:05:02.191 EAL: Heap on socket 0 was shrunk by 10MB 00:05:02.191 EAL: Trying to obtain current memory policy. 00:05:02.191 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:02.191 EAL: Restoring previous memory policy: 4 00:05:02.191 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.191 EAL: request: mp_malloc_sync 00:05:02.191 EAL: No shared files mode enabled, IPC is disabled 00:05:02.191 EAL: Heap on socket 0 was expanded by 18MB 00:05:02.192 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.192 EAL: request: mp_malloc_sync 00:05:02.192 EAL: No shared files mode enabled, IPC is disabled 00:05:02.192 EAL: Heap on socket 0 was shrunk by 18MB 00:05:02.192 EAL: Trying to obtain current memory policy. 00:05:02.192 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:02.192 EAL: Restoring previous memory policy: 4 00:05:02.192 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.192 EAL: request: mp_malloc_sync 00:05:02.192 EAL: No shared files mode enabled, IPC is disabled 00:05:02.192 EAL: Heap on socket 0 was expanded by 34MB 00:05:02.192 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.192 EAL: request: mp_malloc_sync 00:05:02.192 EAL: No shared files mode enabled, IPC is disabled 00:05:02.192 EAL: Heap on socket 0 was shrunk by 34MB 00:05:02.192 EAL: Trying to obtain current memory policy. 00:05:02.192 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:02.192 EAL: Restoring previous memory policy: 4 00:05:02.192 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.192 EAL: request: mp_malloc_sync 00:05:02.192 EAL: No shared files mode enabled, IPC is disabled 00:05:02.192 EAL: Heap on socket 0 was expanded by 66MB 00:05:02.192 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.192 EAL: request: mp_malloc_sync 00:05:02.192 EAL: No shared files mode enabled, IPC is disabled 00:05:02.192 EAL: Heap on socket 0 was shrunk by 66MB 00:05:02.192 EAL: Trying to obtain current memory policy. 
00:05:02.192 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:02.192 EAL: Restoring previous memory policy: 4 00:05:02.192 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.192 EAL: request: mp_malloc_sync 00:05:02.192 EAL: No shared files mode enabled, IPC is disabled 00:05:02.192 EAL: Heap on socket 0 was expanded by 130MB 00:05:02.192 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.192 EAL: request: mp_malloc_sync 00:05:02.192 EAL: No shared files mode enabled, IPC is disabled 00:05:02.192 EAL: Heap on socket 0 was shrunk by 130MB 00:05:02.192 EAL: Trying to obtain current memory policy. 00:05:02.192 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:02.192 EAL: Restoring previous memory policy: 4 00:05:02.192 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.192 EAL: request: mp_malloc_sync 00:05:02.192 EAL: No shared files mode enabled, IPC is disabled 00:05:02.192 EAL: Heap on socket 0 was expanded by 258MB 00:05:02.192 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.451 EAL: request: mp_malloc_sync 00:05:02.451 EAL: No shared files mode enabled, IPC is disabled 00:05:02.451 EAL: Heap on socket 0 was shrunk by 258MB 00:05:02.451 EAL: Trying to obtain current memory policy. 00:05:02.451 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:02.451 EAL: Restoring previous memory policy: 4 00:05:02.451 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.451 EAL: request: mp_malloc_sync 00:05:02.451 EAL: No shared files mode enabled, IPC is disabled 00:05:02.451 EAL: Heap on socket 0 was expanded by 514MB 00:05:02.451 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.710 EAL: request: mp_malloc_sync 00:05:02.710 EAL: No shared files mode enabled, IPC is disabled 00:05:02.710 EAL: Heap on socket 0 was shrunk by 514MB 00:05:02.710 EAL: Trying to obtain current memory policy. 
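While these cycles run, the hugepage pool on each NUMA node drains and refills in step with the expand/shrink messages, and it can be watched live from another shell. A small sketch, assuming 2 MB pages on the two nodes this run detected:

```bash
# Watch the free 2 MB hugepage count per NUMA node while the test runs;
# node0/node1 match the two sockets detected above.
watch -n1 'grep -H "" /sys/devices/system/node/node*/hugepages/hugepages-2048kB/free_hugepages'
```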
00:05:02.710 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:02.710 EAL: Restoring previous memory policy: 4 00:05:02.710 EAL: Calling mem event callback 'spdk:(nil)' 00:05:02.710 EAL: request: mp_malloc_sync 00:05:02.710 EAL: No shared files mode enabled, IPC is disabled 00:05:02.710 EAL: Heap on socket 0 was expanded by 1026MB 00:05:02.968 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.227 EAL: request: mp_malloc_sync 00:05:03.227 EAL: No shared files mode enabled, IPC is disabled 00:05:03.227 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:03.227 passed 00:05:03.227 00:05:03.227 Run Summary: Type Total Ran Passed Failed Inactive 00:05:03.227 suites 1 1 n/a 0 0 00:05:03.227 tests 2 2 2 0 0 00:05:03.227 asserts 497 497 497 0 n/a 00:05:03.227 00:05:03.227 Elapsed time = 0.958 seconds 00:05:03.227 EAL: Calling mem event callback 'spdk:(nil)' 00:05:03.227 EAL: request: mp_malloc_sync 00:05:03.227 EAL: No shared files mode enabled, IPC is disabled 00:05:03.227 EAL: Heap on socket 0 was shrunk by 2MB 00:05:03.227 EAL: No shared files mode enabled, IPC is disabled 00:05:03.227 EAL: No shared files mode enabled, IPC is disabled 00:05:03.227 EAL: No shared files mode enabled, IPC is disabled 00:05:03.227 00:05:03.227 real 0m1.084s 00:05:03.227 user 0m0.626s 00:05:03.227 sys 0m0.424s 00:05:03.227 21:27:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:03.227 21:27:41 -- common/autotest_common.sh@10 -- # set +x 00:05:03.227 ************************************ 00:05:03.227 END TEST env_vtophys 00:05:03.227 ************************************ 00:05:03.227 21:27:41 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:03.227 21:27:41 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:03.227 21:27:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:03.227 21:27:41 -- common/autotest_common.sh@10 -- # set +x 00:05:03.227 ************************************ 00:05:03.227 START TEST env_pci 00:05:03.227 ************************************ 00:05:03.227 21:27:41 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:03.227 00:05:03.227 00:05:03.227 CUnit - A unit testing framework for C - Version 2.1-3 00:05:03.227 http://cunit.sourceforge.net/ 00:05:03.227 00:05:03.227 00:05:03.227 Suite: pci 00:05:03.227 Test: pci_hook ...[2024-07-12 21:27:41.822024] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 3553455 has claimed it 00:05:03.227 EAL: Cannot find device (10000:00:01.0) 00:05:03.227 EAL: Failed to attach device on primary process 00:05:03.227 passed 00:05:03.227 00:05:03.227 Run Summary: Type Total Ran Passed Failed Inactive 00:05:03.227 suites 1 1 n/a 0 0 00:05:03.227 tests 1 1 1 0 0 00:05:03.227 asserts 25 25 25 0 n/a 00:05:03.227 00:05:03.227 Elapsed time = 0.033 seconds 00:05:03.227 00:05:03.227 real 0m0.051s 00:05:03.227 user 0m0.019s 00:05:03.227 sys 0m0.032s 00:05:03.227 21:27:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:03.227 21:27:41 -- common/autotest_common.sh@10 -- # set +x 00:05:03.227 ************************************ 00:05:03.227 END TEST env_pci 00:05:03.227 ************************************ 00:05:03.227 21:27:41 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:03.227 21:27:41 -- env/env.sh@15 -- # uname 00:05:03.227 21:27:41 -- env/env.sh@15 -- # '[' 
Linux = Linux ']' 00:05:03.227 21:27:41 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:03.227 21:27:41 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:03.227 21:27:41 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']' 00:05:03.227 21:27:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:03.227 21:27:41 -- common/autotest_common.sh@10 -- # set +x 00:05:03.227 ************************************ 00:05:03.227 START TEST env_dpdk_post_init 00:05:03.227 ************************************ 00:05:03.228 21:27:41 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:03.228 EAL: Detected CPU lcores: 112 00:05:03.228 EAL: Detected NUMA nodes: 2 00:05:03.228 EAL: Detected static linkage of DPDK 00:05:03.228 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:03.228 EAL: Selected IOVA mode 'VA' 00:05:03.228 EAL: No free 2048 kB hugepages reported on node 1 00:05:03.228 EAL: VFIO support initialized 00:05:03.228 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:03.487 EAL: Using IOMMU type 1 (Type 1) 00:05:04.067 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:08.361 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:08.361 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:08.361 Starting DPDK initialization... 00:05:08.361 Starting SPDK post initialization... 00:05:08.361 SPDK NVMe probe 00:05:08.361 Attaching to 0000:d8:00.0 00:05:08.361 Attached to 0000:d8:00.0 00:05:08.361 Cleaning up... 
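The probe/attach/cleanup block above requires the controller to already be bound to vfio-pci, which is what the earlier setup.sh pass in this log did (nvme -> vfio-pci for 0000:d8:00.0). A sketch of redoing just that step for this one device, assuming this tree's setup.sh honors the PCI_ALLOWED filter:

```bash
# Rebind only the controller used in this run, then rerun the post-init test
# with the same EAL arguments the harness passed above.
spdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
sudo PCI_ALLOWED="0000:d8:00.0" "$spdk_dir/scripts/setup.sh"
sudo "$spdk_dir/test/env/env_dpdk_post_init/env_dpdk_post_init" -c 0x1 --base-virtaddr=0x200000000000
```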
00:05:08.361 00:05:08.361 real 0m4.745s 00:05:08.361 user 0m3.564s 00:05:08.361 sys 0m0.428s 00:05:08.361 21:27:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:08.361 21:27:46 -- common/autotest_common.sh@10 -- # set +x 00:05:08.361 ************************************ 00:05:08.361 END TEST env_dpdk_post_init 00:05:08.361 ************************************ 00:05:08.361 21:27:46 -- env/env.sh@26 -- # uname 00:05:08.361 21:27:46 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:08.361 21:27:46 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:08.361 21:27:46 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:08.361 21:27:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:08.361 21:27:46 -- common/autotest_common.sh@10 -- # set +x 00:05:08.361 ************************************ 00:05:08.361 START TEST env_mem_callbacks 00:05:08.361 ************************************ 00:05:08.361 21:27:46 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:08.361 EAL: Detected CPU lcores: 112 00:05:08.361 EAL: Detected NUMA nodes: 2 00:05:08.361 EAL: Detected static linkage of DPDK 00:05:08.361 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:08.361 EAL: Selected IOVA mode 'VA' 00:05:08.361 EAL: No free 2048 kB hugepages reported on node 1 00:05:08.361 EAL: VFIO support initialized 00:05:08.361 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:08.361 00:05:08.361 00:05:08.361 CUnit - A unit testing framework for C - Version 2.1-3 00:05:08.361 http://cunit.sourceforge.net/ 00:05:08.361 00:05:08.362 00:05:08.362 Suite: memory 00:05:08.362 Test: test ... 
00:05:08.362 register 0x200000200000 2097152 00:05:08.362 malloc 3145728 00:05:08.362 register 0x200000400000 4194304 00:05:08.362 buf 0x200000500000 len 3145728 PASSED 00:05:08.362 malloc 64 00:05:08.362 buf 0x2000004fff40 len 64 PASSED 00:05:08.362 malloc 4194304 00:05:08.362 register 0x200000800000 6291456 00:05:08.362 buf 0x200000a00000 len 4194304 PASSED 00:05:08.362 free 0x200000500000 3145728 00:05:08.362 free 0x2000004fff40 64 00:05:08.362 unregister 0x200000400000 4194304 PASSED 00:05:08.362 free 0x200000a00000 4194304 00:05:08.362 unregister 0x200000800000 6291456 PASSED 00:05:08.362 malloc 8388608 00:05:08.362 register 0x200000400000 10485760 00:05:08.362 buf 0x200000600000 len 8388608 PASSED 00:05:08.362 free 0x200000600000 8388608 00:05:08.362 unregister 0x200000400000 10485760 PASSED 00:05:08.362 passed 00:05:08.362 00:05:08.362 Run Summary: Type Total Ran Passed Failed Inactive 00:05:08.362 suites 1 1 n/a 0 0 00:05:08.362 tests 1 1 1 0 0 00:05:08.362 asserts 15 15 15 0 n/a 00:05:08.362 00:05:08.362 Elapsed time = 0.005 seconds 00:05:08.362 00:05:08.362 real 0m0.045s 00:05:08.362 user 0m0.007s 00:05:08.362 sys 0m0.039s 00:05:08.362 21:27:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:08.362 21:27:46 -- common/autotest_common.sh@10 -- # set +x 00:05:08.362 ************************************ 00:05:08.362 END TEST env_mem_callbacks 00:05:08.362 ************************************ 00:05:08.362 00:05:08.362 real 0m6.343s 00:05:08.362 user 0m4.423s 00:05:08.362 sys 0m1.179s 00:05:08.362 21:27:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:08.362 21:27:46 -- common/autotest_common.sh@10 -- # set +x 00:05:08.362 ************************************ 00:05:08.362 END TEST env 00:05:08.362 ************************************ 00:05:08.362 21:27:46 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:08.362 21:27:46 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:08.362 21:27:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:08.362 21:27:46 -- common/autotest_common.sh@10 -- # set +x 00:05:08.362 ************************************ 00:05:08.362 START TEST rpc 00:05:08.362 ************************************ 00:05:08.362 21:27:46 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:08.362 * Looking for test storage... 00:05:08.362 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:08.362 21:27:46 -- rpc/rpc.sh@65 -- # spdk_pid=3554566 00:05:08.362 21:27:46 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:08.362 21:27:46 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:08.362 21:27:46 -- rpc/rpc.sh@67 -- # waitforlisten 3554566 00:05:08.362 21:27:46 -- common/autotest_common.sh@819 -- # '[' -z 3554566 ']' 00:05:08.362 21:27:46 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:08.362 21:27:46 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:08.362 21:27:46 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:08.362 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
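The waitforlisten helper that prints the message above simply polls the JSON-RPC socket until the target answers. Its essence, reduced to a few lines, assuming the default /var/tmp/spdk.sock and using rpc_get_methods as the cheapest call to probe with:

```bash
# Minimal stand-in for waitforlisten: poll the JSON-RPC socket until spdk_tgt answers.
spdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
sudo "$spdk_dir/build/bin/spdk_tgt" -e bdev &
until sudo "$spdk_dir/scripts/rpc.py" -t 1 rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done
echo "spdk_tgt is up on /var/tmp/spdk.sock"
```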
00:05:08.362 21:27:46 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:08.362 21:27:46 -- common/autotest_common.sh@10 -- # set +x 00:05:08.362 [2024-07-12 21:27:46.954902] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:08.362 [2024-07-12 21:27:46.954968] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3554566 ] 00:05:08.362 EAL: No free 2048 kB hugepages reported on node 1 00:05:08.362 [2024-07-12 21:27:47.022548] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:08.362 [2024-07-12 21:27:47.099314] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:08.362 [2024-07-12 21:27:47.099418] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:08.362 [2024-07-12 21:27:47.099429] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 3554566' to capture a snapshot of events at runtime. 00:05:08.362 [2024-07-12 21:27:47.099438] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid3554566 for offline analysis/debug. 00:05:08.362 [2024-07-12 21:27:47.099461] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:09.300 21:27:47 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:09.300 21:27:47 -- common/autotest_common.sh@852 -- # return 0 00:05:09.300 21:27:47 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:09.300 21:27:47 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:09.300 21:27:47 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:09.300 21:27:47 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:09.300 21:27:47 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:09.300 21:27:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:09.300 21:27:47 -- common/autotest_common.sh@10 -- # set +x 00:05:09.300 ************************************ 00:05:09.300 START TEST rpc_integrity 00:05:09.300 ************************************ 00:05:09.300 21:27:47 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:09.300 21:27:47 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:09.300 21:27:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.300 21:27:47 -- common/autotest_common.sh@10 -- # set +x 00:05:09.300 21:27:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.300 21:27:47 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:09.300 21:27:47 -- rpc/rpc.sh@13 -- # jq length 00:05:09.300 21:27:47 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:09.300 21:27:47 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:09.300 21:27:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.300 21:27:47 -- common/autotest_common.sh@10 -- # set +x 00:05:09.300 21:27:47 -- 
common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.300 21:27:47 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:09.300 21:27:47 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:09.300 21:27:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.300 21:27:47 -- common/autotest_common.sh@10 -- # set +x 00:05:09.300 21:27:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.300 21:27:47 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:09.300 { 00:05:09.300 "name": "Malloc0", 00:05:09.300 "aliases": [ 00:05:09.300 "959c01b2-1873-485d-b1ff-bb29079a7c4f" 00:05:09.300 ], 00:05:09.300 "product_name": "Malloc disk", 00:05:09.300 "block_size": 512, 00:05:09.300 "num_blocks": 16384, 00:05:09.300 "uuid": "959c01b2-1873-485d-b1ff-bb29079a7c4f", 00:05:09.300 "assigned_rate_limits": { 00:05:09.300 "rw_ios_per_sec": 0, 00:05:09.300 "rw_mbytes_per_sec": 0, 00:05:09.300 "r_mbytes_per_sec": 0, 00:05:09.300 "w_mbytes_per_sec": 0 00:05:09.300 }, 00:05:09.300 "claimed": false, 00:05:09.300 "zoned": false, 00:05:09.300 "supported_io_types": { 00:05:09.300 "read": true, 00:05:09.300 "write": true, 00:05:09.300 "unmap": true, 00:05:09.300 "write_zeroes": true, 00:05:09.300 "flush": true, 00:05:09.300 "reset": true, 00:05:09.300 "compare": false, 00:05:09.300 "compare_and_write": false, 00:05:09.300 "abort": true, 00:05:09.300 "nvme_admin": false, 00:05:09.300 "nvme_io": false 00:05:09.300 }, 00:05:09.300 "memory_domains": [ 00:05:09.300 { 00:05:09.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:09.300 "dma_device_type": 2 00:05:09.300 } 00:05:09.300 ], 00:05:09.300 "driver_specific": {} 00:05:09.300 } 00:05:09.300 ]' 00:05:09.300 21:27:47 -- rpc/rpc.sh@17 -- # jq length 00:05:09.300 21:27:47 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:09.300 21:27:47 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:09.300 21:27:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.300 21:27:47 -- common/autotest_common.sh@10 -- # set +x 00:05:09.300 [2024-07-12 21:27:47.895655] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:09.300 [2024-07-12 21:27:47.895687] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:09.300 [2024-07-12 21:27:47.895703] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x50d65e0 00:05:09.300 [2024-07-12 21:27:47.895717] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:09.300 [2024-07-12 21:27:47.896562] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:09.300 [2024-07-12 21:27:47.896585] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:09.300 Passthru0 00:05:09.300 21:27:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.300 21:27:47 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:09.300 21:27:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.300 21:27:47 -- common/autotest_common.sh@10 -- # set +x 00:05:09.300 21:27:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.300 21:27:47 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:09.300 { 00:05:09.300 "name": "Malloc0", 00:05:09.300 "aliases": [ 00:05:09.300 "959c01b2-1873-485d-b1ff-bb29079a7c4f" 00:05:09.300 ], 00:05:09.300 "product_name": "Malloc disk", 00:05:09.300 "block_size": 512, 00:05:09.300 "num_blocks": 16384, 00:05:09.300 "uuid": "959c01b2-1873-485d-b1ff-bb29079a7c4f", 00:05:09.300 "assigned_rate_limits": { 00:05:09.300 "rw_ios_per_sec": 0, 00:05:09.300 
"rw_mbytes_per_sec": 0, 00:05:09.300 "r_mbytes_per_sec": 0, 00:05:09.300 "w_mbytes_per_sec": 0 00:05:09.300 }, 00:05:09.300 "claimed": true, 00:05:09.300 "claim_type": "exclusive_write", 00:05:09.300 "zoned": false, 00:05:09.300 "supported_io_types": { 00:05:09.300 "read": true, 00:05:09.300 "write": true, 00:05:09.300 "unmap": true, 00:05:09.300 "write_zeroes": true, 00:05:09.300 "flush": true, 00:05:09.300 "reset": true, 00:05:09.300 "compare": false, 00:05:09.300 "compare_and_write": false, 00:05:09.300 "abort": true, 00:05:09.300 "nvme_admin": false, 00:05:09.300 "nvme_io": false 00:05:09.300 }, 00:05:09.300 "memory_domains": [ 00:05:09.300 { 00:05:09.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:09.300 "dma_device_type": 2 00:05:09.300 } 00:05:09.300 ], 00:05:09.300 "driver_specific": {} 00:05:09.300 }, 00:05:09.300 { 00:05:09.300 "name": "Passthru0", 00:05:09.300 "aliases": [ 00:05:09.300 "1ab64efc-4a00-5efa-8293-5440f203e4bb" 00:05:09.300 ], 00:05:09.300 "product_name": "passthru", 00:05:09.300 "block_size": 512, 00:05:09.300 "num_blocks": 16384, 00:05:09.300 "uuid": "1ab64efc-4a00-5efa-8293-5440f203e4bb", 00:05:09.300 "assigned_rate_limits": { 00:05:09.300 "rw_ios_per_sec": 0, 00:05:09.300 "rw_mbytes_per_sec": 0, 00:05:09.300 "r_mbytes_per_sec": 0, 00:05:09.300 "w_mbytes_per_sec": 0 00:05:09.300 }, 00:05:09.300 "claimed": false, 00:05:09.300 "zoned": false, 00:05:09.300 "supported_io_types": { 00:05:09.300 "read": true, 00:05:09.300 "write": true, 00:05:09.300 "unmap": true, 00:05:09.300 "write_zeroes": true, 00:05:09.300 "flush": true, 00:05:09.300 "reset": true, 00:05:09.300 "compare": false, 00:05:09.300 "compare_and_write": false, 00:05:09.300 "abort": true, 00:05:09.300 "nvme_admin": false, 00:05:09.300 "nvme_io": false 00:05:09.300 }, 00:05:09.300 "memory_domains": [ 00:05:09.300 { 00:05:09.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:09.300 "dma_device_type": 2 00:05:09.300 } 00:05:09.300 ], 00:05:09.300 "driver_specific": { 00:05:09.300 "passthru": { 00:05:09.300 "name": "Passthru0", 00:05:09.300 "base_bdev_name": "Malloc0" 00:05:09.300 } 00:05:09.300 } 00:05:09.300 } 00:05:09.300 ]' 00:05:09.300 21:27:47 -- rpc/rpc.sh@21 -- # jq length 00:05:09.300 21:27:47 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:09.300 21:27:47 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:09.300 21:27:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.300 21:27:47 -- common/autotest_common.sh@10 -- # set +x 00:05:09.300 21:27:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.300 21:27:47 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:09.300 21:27:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.300 21:27:47 -- common/autotest_common.sh@10 -- # set +x 00:05:09.300 21:27:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.300 21:27:47 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:09.300 21:27:47 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.300 21:27:47 -- common/autotest_common.sh@10 -- # set +x 00:05:09.300 21:27:47 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.300 21:27:48 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:09.300 21:27:48 -- rpc/rpc.sh@26 -- # jq length 00:05:09.300 21:27:48 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:09.300 00:05:09.300 real 0m0.263s 00:05:09.301 user 0m0.163s 00:05:09.301 sys 0m0.040s 00:05:09.301 21:27:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:09.301 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.301 
************************************ 00:05:09.301 END TEST rpc_integrity 00:05:09.301 ************************************ 00:05:09.301 21:27:48 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:09.301 21:27:48 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:09.301 21:27:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:09.301 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.301 ************************************ 00:05:09.301 START TEST rpc_plugins 00:05:09.301 ************************************ 00:05:09.301 21:27:48 -- common/autotest_common.sh@1104 -- # rpc_plugins 00:05:09.560 21:27:48 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:09.560 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.560 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.560 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.560 21:27:48 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:09.560 21:27:48 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:09.560 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.560 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.560 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.560 21:27:48 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:09.560 { 00:05:09.560 "name": "Malloc1", 00:05:09.560 "aliases": [ 00:05:09.560 "abde241e-d928-401c-9721-b405129821fd" 00:05:09.560 ], 00:05:09.560 "product_name": "Malloc disk", 00:05:09.560 "block_size": 4096, 00:05:09.560 "num_blocks": 256, 00:05:09.560 "uuid": "abde241e-d928-401c-9721-b405129821fd", 00:05:09.560 "assigned_rate_limits": { 00:05:09.560 "rw_ios_per_sec": 0, 00:05:09.560 "rw_mbytes_per_sec": 0, 00:05:09.560 "r_mbytes_per_sec": 0, 00:05:09.560 "w_mbytes_per_sec": 0 00:05:09.560 }, 00:05:09.560 "claimed": false, 00:05:09.560 "zoned": false, 00:05:09.560 "supported_io_types": { 00:05:09.560 "read": true, 00:05:09.560 "write": true, 00:05:09.560 "unmap": true, 00:05:09.560 "write_zeroes": true, 00:05:09.560 "flush": true, 00:05:09.560 "reset": true, 00:05:09.560 "compare": false, 00:05:09.560 "compare_and_write": false, 00:05:09.560 "abort": true, 00:05:09.560 "nvme_admin": false, 00:05:09.560 "nvme_io": false 00:05:09.560 }, 00:05:09.560 "memory_domains": [ 00:05:09.560 { 00:05:09.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:09.560 "dma_device_type": 2 00:05:09.560 } 00:05:09.560 ], 00:05:09.560 "driver_specific": {} 00:05:09.560 } 00:05:09.560 ]' 00:05:09.560 21:27:48 -- rpc/rpc.sh@32 -- # jq length 00:05:09.560 21:27:48 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:09.560 21:27:48 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:09.560 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.560 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.560 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.560 21:27:48 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:09.560 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.560 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.560 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.560 21:27:48 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:09.560 21:27:48 -- rpc/rpc.sh@36 -- # jq length 00:05:09.560 21:27:48 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:09.560 00:05:09.560 real 0m0.142s 00:05:09.560 user 0m0.080s 00:05:09.560 sys 0m0.028s 00:05:09.560 21:27:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 
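The rpc_plugins run above drives rpc.py through its plugin loader: a module named rpc_plugin, importable via the PYTHONPATH exported earlier in this log, registers the create_malloc/delete_malloc subcommands. A sketch of invoking them the same way, assuming that PYTHONPATH is still in effect:

```bash
# Drive the same plugin subcommands the rpc_plugins test used above; the
# rpc_plugin module must be importable, hence the PYTHONPATH export.
spdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
export PYTHONPATH="$spdk_dir/test/rpc_plugins:$PYTHONPATH"
"$spdk_dir/scripts/rpc.py" --plugin rpc_plugin create_malloc     # prints the new bdev name, e.g. Malloc1
"$spdk_dir/scripts/rpc.py" --plugin rpc_plugin delete_malloc Malloc1
```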
00:05:09.560 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.560 ************************************ 00:05:09.560 END TEST rpc_plugins 00:05:09.560 ************************************ 00:05:09.560 21:27:48 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:09.560 21:27:48 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:09.560 21:27:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:09.560 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.560 ************************************ 00:05:09.560 START TEST rpc_trace_cmd_test 00:05:09.560 ************************************ 00:05:09.560 21:27:48 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:05:09.560 21:27:48 -- rpc/rpc.sh@40 -- # local info 00:05:09.560 21:27:48 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:09.560 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.560 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.560 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.560 21:27:48 -- rpc/rpc.sh@42 -- # info='{ 00:05:09.560 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid3554566", 00:05:09.560 "tpoint_group_mask": "0x8", 00:05:09.560 "iscsi_conn": { 00:05:09.560 "mask": "0x2", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 }, 00:05:09.560 "scsi": { 00:05:09.560 "mask": "0x4", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 }, 00:05:09.560 "bdev": { 00:05:09.560 "mask": "0x8", 00:05:09.560 "tpoint_mask": "0xffffffffffffffff" 00:05:09.560 }, 00:05:09.560 "nvmf_rdma": { 00:05:09.560 "mask": "0x10", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 }, 00:05:09.560 "nvmf_tcp": { 00:05:09.560 "mask": "0x20", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 }, 00:05:09.560 "ftl": { 00:05:09.560 "mask": "0x40", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 }, 00:05:09.560 "blobfs": { 00:05:09.560 "mask": "0x80", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 }, 00:05:09.560 "dsa": { 00:05:09.560 "mask": "0x200", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 }, 00:05:09.560 "thread": { 00:05:09.560 "mask": "0x400", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 }, 00:05:09.560 "nvme_pcie": { 00:05:09.560 "mask": "0x800", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 }, 00:05:09.560 "iaa": { 00:05:09.560 "mask": "0x1000", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 }, 00:05:09.560 "nvme_tcp": { 00:05:09.560 "mask": "0x2000", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 }, 00:05:09.560 "bdev_nvme": { 00:05:09.560 "mask": "0x4000", 00:05:09.560 "tpoint_mask": "0x0" 00:05:09.560 } 00:05:09.560 }' 00:05:09.560 21:27:48 -- rpc/rpc.sh@43 -- # jq length 00:05:09.560 21:27:48 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:09.560 21:27:48 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:09.820 21:27:48 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:09.820 21:27:48 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:09.820 21:27:48 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:09.820 21:27:48 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:09.820 21:27:48 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:09.820 21:27:48 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:09.820 21:27:48 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:09.820 00:05:09.820 real 0m0.221s 00:05:09.820 user 0m0.183s 00:05:09.820 sys 0m0.030s 00:05:09.820 21:27:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:09.820 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.820 
************************************ 00:05:09.820 END TEST rpc_trace_cmd_test 00:05:09.820 ************************************ 00:05:09.820 21:27:48 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:09.820 21:27:48 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:09.820 21:27:48 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:09.820 21:27:48 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:09.820 21:27:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:09.820 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.820 ************************************ 00:05:09.820 START TEST rpc_daemon_integrity 00:05:09.820 ************************************ 00:05:09.820 21:27:48 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:09.820 21:27:48 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:09.820 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.820 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.820 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.820 21:27:48 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:09.820 21:27:48 -- rpc/rpc.sh@13 -- # jq length 00:05:09.820 21:27:48 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:09.820 21:27:48 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:09.820 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.820 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:09.820 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:09.820 21:27:48 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:09.820 21:27:48 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:09.820 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:09.820 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:10.079 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:10.079 21:27:48 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:10.079 { 00:05:10.079 "name": "Malloc2", 00:05:10.079 "aliases": [ 00:05:10.079 "fd0eb6e0-a161-4b34-80ca-b5fcaac99b46" 00:05:10.079 ], 00:05:10.079 "product_name": "Malloc disk", 00:05:10.079 "block_size": 512, 00:05:10.079 "num_blocks": 16384, 00:05:10.079 "uuid": "fd0eb6e0-a161-4b34-80ca-b5fcaac99b46", 00:05:10.079 "assigned_rate_limits": { 00:05:10.079 "rw_ios_per_sec": 0, 00:05:10.079 "rw_mbytes_per_sec": 0, 00:05:10.079 "r_mbytes_per_sec": 0, 00:05:10.079 "w_mbytes_per_sec": 0 00:05:10.079 }, 00:05:10.079 "claimed": false, 00:05:10.079 "zoned": false, 00:05:10.079 "supported_io_types": { 00:05:10.079 "read": true, 00:05:10.079 "write": true, 00:05:10.079 "unmap": true, 00:05:10.079 "write_zeroes": true, 00:05:10.079 "flush": true, 00:05:10.079 "reset": true, 00:05:10.079 "compare": false, 00:05:10.079 "compare_and_write": false, 00:05:10.079 "abort": true, 00:05:10.079 "nvme_admin": false, 00:05:10.079 "nvme_io": false 00:05:10.079 }, 00:05:10.079 "memory_domains": [ 00:05:10.079 { 00:05:10.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:10.079 "dma_device_type": 2 00:05:10.079 } 00:05:10.079 ], 00:05:10.079 "driver_specific": {} 00:05:10.079 } 00:05:10.079 ]' 00:05:10.079 21:27:48 -- rpc/rpc.sh@17 -- # jq length 00:05:10.079 21:27:48 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:10.079 21:27:48 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:10.079 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:10.079 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:10.079 [2024-07-12 21:27:48.665645] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
Malloc2 00:05:10.079 [2024-07-12 21:27:48.665675] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:10.079 [2024-07-12 21:27:48.665690] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4f3e5e0 00:05:10.079 [2024-07-12 21:27:48.665699] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:10.079 [2024-07-12 21:27:48.666406] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:10.079 [2024-07-12 21:27:48.666425] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:10.079 Passthru0 00:05:10.079 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:10.079 21:27:48 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:10.079 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:10.079 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:10.079 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:10.079 21:27:48 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:10.079 { 00:05:10.079 "name": "Malloc2", 00:05:10.079 "aliases": [ 00:05:10.079 "fd0eb6e0-a161-4b34-80ca-b5fcaac99b46" 00:05:10.079 ], 00:05:10.079 "product_name": "Malloc disk", 00:05:10.079 "block_size": 512, 00:05:10.079 "num_blocks": 16384, 00:05:10.079 "uuid": "fd0eb6e0-a161-4b34-80ca-b5fcaac99b46", 00:05:10.079 "assigned_rate_limits": { 00:05:10.079 "rw_ios_per_sec": 0, 00:05:10.079 "rw_mbytes_per_sec": 0, 00:05:10.079 "r_mbytes_per_sec": 0, 00:05:10.079 "w_mbytes_per_sec": 0 00:05:10.079 }, 00:05:10.079 "claimed": true, 00:05:10.079 "claim_type": "exclusive_write", 00:05:10.079 "zoned": false, 00:05:10.079 "supported_io_types": { 00:05:10.079 "read": true, 00:05:10.079 "write": true, 00:05:10.080 "unmap": true, 00:05:10.080 "write_zeroes": true, 00:05:10.080 "flush": true, 00:05:10.080 "reset": true, 00:05:10.080 "compare": false, 00:05:10.080 "compare_and_write": false, 00:05:10.080 "abort": true, 00:05:10.080 "nvme_admin": false, 00:05:10.080 "nvme_io": false 00:05:10.080 }, 00:05:10.080 "memory_domains": [ 00:05:10.080 { 00:05:10.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:10.080 "dma_device_type": 2 00:05:10.080 } 00:05:10.080 ], 00:05:10.080 "driver_specific": {} 00:05:10.080 }, 00:05:10.080 { 00:05:10.080 "name": "Passthru0", 00:05:10.080 "aliases": [ 00:05:10.080 "34b58df5-e00a-5afb-b86e-a53585e925c7" 00:05:10.080 ], 00:05:10.080 "product_name": "passthru", 00:05:10.080 "block_size": 512, 00:05:10.080 "num_blocks": 16384, 00:05:10.080 "uuid": "34b58df5-e00a-5afb-b86e-a53585e925c7", 00:05:10.080 "assigned_rate_limits": { 00:05:10.080 "rw_ios_per_sec": 0, 00:05:10.080 "rw_mbytes_per_sec": 0, 00:05:10.080 "r_mbytes_per_sec": 0, 00:05:10.080 "w_mbytes_per_sec": 0 00:05:10.080 }, 00:05:10.080 "claimed": false, 00:05:10.080 "zoned": false, 00:05:10.080 "supported_io_types": { 00:05:10.080 "read": true, 00:05:10.080 "write": true, 00:05:10.080 "unmap": true, 00:05:10.080 "write_zeroes": true, 00:05:10.080 "flush": true, 00:05:10.080 "reset": true, 00:05:10.080 "compare": false, 00:05:10.080 "compare_and_write": false, 00:05:10.080 "abort": true, 00:05:10.080 "nvme_admin": false, 00:05:10.080 "nvme_io": false 00:05:10.080 }, 00:05:10.080 "memory_domains": [ 00:05:10.080 { 00:05:10.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:10.080 "dma_device_type": 2 00:05:10.080 } 00:05:10.080 ], 00:05:10.080 "driver_specific": { 00:05:10.080 "passthru": { 00:05:10.080 "name": "Passthru0", 00:05:10.080 "base_bdev_name": "Malloc2" 00:05:10.080 } 
00:05:10.080 } 00:05:10.080 } 00:05:10.080 ]' 00:05:10.080 21:27:48 -- rpc/rpc.sh@21 -- # jq length 00:05:10.080 21:27:48 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:10.080 21:27:48 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:10.080 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:10.080 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:10.080 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:10.080 21:27:48 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:10.080 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:10.080 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:10.080 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:10.080 21:27:48 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:10.080 21:27:48 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:10.080 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:10.080 21:27:48 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:10.080 21:27:48 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:10.080 21:27:48 -- rpc/rpc.sh@26 -- # jq length 00:05:10.080 21:27:48 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:10.080 00:05:10.080 real 0m0.269s 00:05:10.080 user 0m0.170s 00:05:10.080 sys 0m0.043s 00:05:10.080 21:27:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:10.080 21:27:48 -- common/autotest_common.sh@10 -- # set +x 00:05:10.080 ************************************ 00:05:10.080 END TEST rpc_daemon_integrity 00:05:10.080 ************************************ 00:05:10.080 21:27:48 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:10.080 21:27:48 -- rpc/rpc.sh@84 -- # killprocess 3554566 00:05:10.080 21:27:48 -- common/autotest_common.sh@926 -- # '[' -z 3554566 ']' 00:05:10.080 21:27:48 -- common/autotest_common.sh@930 -- # kill -0 3554566 00:05:10.080 21:27:48 -- common/autotest_common.sh@931 -- # uname 00:05:10.080 21:27:48 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:10.080 21:27:48 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3554566 00:05:10.339 21:27:48 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:10.339 21:27:48 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:10.339 21:27:48 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3554566' 00:05:10.339 killing process with pid 3554566 00:05:10.339 21:27:48 -- common/autotest_common.sh@945 -- # kill 3554566 00:05:10.339 21:27:48 -- common/autotest_common.sh@950 -- # wait 3554566 00:05:10.598 00:05:10.598 real 0m2.366s 00:05:10.598 user 0m2.986s 00:05:10.598 sys 0m0.709s 00:05:10.598 21:27:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:10.598 21:27:49 -- common/autotest_common.sh@10 -- # set +x 00:05:10.598 ************************************ 00:05:10.598 END TEST rpc 00:05:10.598 ************************************ 00:05:10.598 21:27:49 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:10.598 21:27:49 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:10.598 21:27:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:10.598 21:27:49 -- common/autotest_common.sh@10 -- # set +x 00:05:10.598 ************************************ 00:05:10.598 START TEST rpc_client 00:05:10.598 ************************************ 00:05:10.598 21:27:49 -- common/autotest_common.sh@1104 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:10.598 * Looking for test storage... 00:05:10.598 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:10.598 21:27:49 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:10.598 OK 00:05:10.598 21:27:49 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:10.598 00:05:10.598 real 0m0.094s 00:05:10.598 user 0m0.037s 00:05:10.598 sys 0m0.066s 00:05:10.598 21:27:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:10.598 21:27:49 -- common/autotest_common.sh@10 -- # set +x 00:05:10.598 ************************************ 00:05:10.598 END TEST rpc_client 00:05:10.598 ************************************ 00:05:10.858 21:27:49 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:10.858 21:27:49 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:10.858 21:27:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:10.858 21:27:49 -- common/autotest_common.sh@10 -- # set +x 00:05:10.858 ************************************ 00:05:10.858 START TEST json_config 00:05:10.858 ************************************ 00:05:10.858 21:27:49 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:10.858 21:27:49 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:10.858 21:27:49 -- nvmf/common.sh@7 -- # uname -s 00:05:10.858 21:27:49 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:10.858 21:27:49 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:10.858 21:27:49 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:10.858 21:27:49 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:10.858 21:27:49 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:10.858 21:27:49 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:10.858 21:27:49 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:10.858 21:27:49 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:10.858 21:27:49 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:10.858 21:27:49 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:10.858 21:27:49 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:10.858 21:27:49 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:10.858 21:27:49 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:10.858 21:27:49 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:10.858 21:27:49 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:10.858 21:27:49 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:10.858 21:27:49 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:10.858 21:27:49 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:10.858 21:27:49 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:10.858 21:27:49 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:10.858 21:27:49 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:10.858 21:27:49 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:10.858 21:27:49 -- paths/export.sh@5 -- # export PATH 00:05:10.858 21:27:49 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:10.858 21:27:49 -- nvmf/common.sh@46 -- # : 0 00:05:10.858 21:27:49 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:10.858 21:27:49 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:10.858 21:27:49 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:10.858 21:27:49 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:10.858 21:27:49 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:10.858 21:27:49 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:10.858 21:27:49 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:10.858 21:27:49 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:10.858 21:27:49 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:10.858 21:27:49 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:10.858 21:27:49 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:10.858 21:27:49 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:10.858 21:27:49 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:10.858 WARNING: No tests are enabled so not running JSON configuration tests 00:05:10.858 21:27:49 -- json_config/json_config.sh@27 -- # exit 0 00:05:10.858 00:05:10.858 real 0m0.103s 00:05:10.858 user 0m0.043s 00:05:10.858 sys 0m0.061s 00:05:10.858 21:27:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:10.858 21:27:49 -- common/autotest_common.sh@10 -- # set +x 00:05:10.858 ************************************ 00:05:10.858 END TEST json_config 00:05:10.858 ************************************ 00:05:10.858 21:27:49 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:10.858 21:27:49 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:10.858 21:27:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:10.858 21:27:49 -- common/autotest_common.sh@10 -- # set +x 00:05:10.858 ************************************ 00:05:10.858 START TEST json_config_extra_key 00:05:10.858 ************************************ 00:05:10.858 21:27:49 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:10.858 21:27:49 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:10.858 21:27:49 -- nvmf/common.sh@7 -- # uname -s 00:05:10.858 21:27:49 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:10.858 21:27:49 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:10.858 21:27:49 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:10.858 21:27:49 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:10.858 21:27:49 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:10.858 21:27:49 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:10.858 21:27:49 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:10.858 21:27:49 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:10.858 21:27:49 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:10.858 21:27:49 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:10.858 21:27:49 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:10.858 21:27:49 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:10.858 21:27:49 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:10.858 21:27:49 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:10.858 21:27:49 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:10.858 21:27:49 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:10.858 21:27:49 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:10.858 21:27:49 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:10.859 21:27:49 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:10.859 21:27:49 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:10.859 21:27:49 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:10.859 21:27:49 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:10.859 21:27:49 -- paths/export.sh@5 -- # export PATH 00:05:10.859 21:27:49 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:10.859 21:27:49 -- nvmf/common.sh@46 -- # : 0 00:05:10.859 21:27:49 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:11.118 21:27:49 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:11.118 21:27:49 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:11.118 21:27:49 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:11.118 21:27:49 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:11.118 21:27:49 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:11.118 21:27:49 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:11.118 21:27:49 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:11.118 INFO: launching applications... 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=3555213 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:11.118 Waiting for target to run... 
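For reference, the rpc_daemon_integrity run earlier in this excerpt reduces to a short JSON-RPC sequence. A by-hand sketch, assuming a spdk_tgt already listening on the default /var/tmp/spdk.sock and run from the SPDK tree (every call below appears in the trace above):

# Sketch only: create a malloc base bdev, wrap it in a passthru bdev,
# verify both are listed, then tear down in reverse order.
scripts/rpc.py bdev_malloc_create 8 512                    # 8 MB, 512 B blocks -> Malloc2 in this run
scripts/rpc.py bdev_passthru_create -b Malloc2 -p Passthru0
scripts/rpc.py bdev_get_bdevs | jq length                  # expect 2 (Malloc2 + Passthru0)
scripts/rpc.py bdev_passthru_delete Passthru0
scripts/rpc.py bdev_malloc_delete Malloc2
scripts/rpc.py bdev_get_bdevs | jq length                  # back to 0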
00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 3555213 /var/tmp/spdk_tgt.sock 00:05:11.118 21:27:49 -- common/autotest_common.sh@819 -- # '[' -z 3555213 ']' 00:05:11.118 21:27:49 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:11.118 21:27:49 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:11.118 21:27:49 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:11.118 21:27:49 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:11.118 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:11.118 21:27:49 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:11.118 21:27:49 -- common/autotest_common.sh@10 -- # set +x 00:05:11.118 [2024-07-12 21:27:49.672970] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:11.118 [2024-07-12 21:27:49.673039] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3555213 ] 00:05:11.118 EAL: No free 2048 kB hugepages reported on node 1 00:05:11.377 [2024-07-12 21:27:49.954532] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:11.377 [2024-07-12 21:27:50.018204] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:11.377 [2024-07-12 21:27:50.018318] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:11.946 21:27:50 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:11.946 21:27:50 -- common/autotest_common.sh@852 -- # return 0 00:05:11.946 21:27:50 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:11.946 00:05:11.946 21:27:50 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:11.946 INFO: shutting down applications... 
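The target launch just logged can be reproduced with the same arguments; a minimal sketch, in which the readiness poll is a hand-rolled stand-in for the harness's waitforlisten helper (paths relative to the SPDK tree):

# Start spdk_tgt with a JSON config and a dedicated RPC socket, as above.
build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
    --json test/json_config/extra_key.json &
pid=$!
# Poll until the RPC socket answers; rpc_get_methods is just a cheap RPC.
until scripts/rpc.py -s /var/tmp/spdk_tgt.sock rpc_get_methods >/dev/null 2>&1; do
    sleep 0.1
done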
00:05:11.946 21:27:50 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:11.946 21:27:50 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:11.946 21:27:50 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:11.946 21:27:50 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 3555213 ]] 00:05:11.946 21:27:50 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 3555213 00:05:11.946 21:27:50 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:11.946 21:27:50 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:11.946 21:27:50 -- json_config/json_config_extra_key.sh@50 -- # kill -0 3555213 00:05:11.946 21:27:50 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:12.515 21:27:50 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:12.515 21:27:50 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:12.515 21:27:50 -- json_config/json_config_extra_key.sh@50 -- # kill -0 3555213 00:05:12.515 21:27:50 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:12.515 21:27:50 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:12.515 21:27:50 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:12.515 21:27:50 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:12.515 SPDK target shutdown done 00:05:12.515 21:27:50 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:12.515 Success 00:05:12.515 00:05:12.515 real 0m1.447s 00:05:12.515 user 0m1.188s 00:05:12.515 sys 0m0.395s 00:05:12.515 21:27:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:12.515 21:27:50 -- common/autotest_common.sh@10 -- # set +x 00:05:12.515 ************************************ 00:05:12.515 END TEST json_config_extra_key 00:05:12.515 ************************************ 00:05:12.515 21:27:51 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:12.515 21:27:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:12.515 21:27:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:12.515 21:27:51 -- common/autotest_common.sh@10 -- # set +x 00:05:12.515 ************************************ 00:05:12.515 START TEST alias_rpc 00:05:12.515 ************************************ 00:05:12.515 21:27:51 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:12.515 * Looking for test storage... 00:05:12.515 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:12.515 21:27:51 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:12.515 21:27:51 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=3555491 00:05:12.515 21:27:51 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 3555491 00:05:12.515 21:27:51 -- common/autotest_common.sh@819 -- # '[' -z 3555491 ']' 00:05:12.515 21:27:51 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:12.515 21:27:51 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:12.515 21:27:51 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:12.515 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
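The shutdown traced above is a plain signal-and-poll loop; written out with the constants the trace shows (at most 30 iterations of 0.5 s):

# Send SIGINT, then poll with kill -0 until the target exits.
kill -SIGINT "$pid"
for ((i = 0; i < 30; i++)); do
    kill -0 "$pid" 2>/dev/null || break    # kill -0 only tests liveness
    sleep 0.5
done
echo 'SPDK target shutdown done'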
00:05:12.515 21:27:51 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:12.515 21:27:51 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:12.515 21:27:51 -- common/autotest_common.sh@10 -- # set +x 00:05:12.515 [2024-07-12 21:27:51.144956] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:12.515 [2024-07-12 21:27:51.145031] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3555491 ] 00:05:12.515 EAL: No free 2048 kB hugepages reported on node 1 00:05:12.515 [2024-07-12 21:27:51.214956] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:12.515 [2024-07-12 21:27:51.291976] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:12.515 [2024-07-12 21:27:51.292083] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:13.451 21:27:51 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:13.451 21:27:51 -- common/autotest_common.sh@852 -- # return 0 00:05:13.451 21:27:51 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:13.451 21:27:52 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 3555491 00:05:13.451 21:27:52 -- common/autotest_common.sh@926 -- # '[' -z 3555491 ']' 00:05:13.451 21:27:52 -- common/autotest_common.sh@930 -- # kill -0 3555491 00:05:13.451 21:27:52 -- common/autotest_common.sh@931 -- # uname 00:05:13.451 21:27:52 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:13.451 21:27:52 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3555491 00:05:13.451 21:27:52 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:13.451 21:27:52 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:13.451 21:27:52 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3555491' 00:05:13.451 killing process with pid 3555491 00:05:13.451 21:27:52 -- common/autotest_common.sh@945 -- # kill 3555491 00:05:13.451 21:27:52 -- common/autotest_common.sh@950 -- # wait 3555491 00:05:14.020 00:05:14.020 real 0m1.463s 00:05:14.020 user 0m1.557s 00:05:14.020 sys 0m0.430s 00:05:14.020 21:27:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:14.020 21:27:52 -- common/autotest_common.sh@10 -- # set +x 00:05:14.020 ************************************ 00:05:14.020 END TEST alias_rpc 00:05:14.020 ************************************ 00:05:14.020 21:27:52 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:05:14.020 21:27:52 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:14.020 21:27:52 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:14.020 21:27:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:14.020 21:27:52 -- common/autotest_common.sh@10 -- # set +x 00:05:14.020 ************************************ 00:05:14.020 START TEST spdkcli_tcp 00:05:14.020 ************************************ 00:05:14.020 21:27:52 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:14.020 * Looking for test storage... 
00:05:14.020 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:14.020 21:27:52 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:14.020 21:27:52 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:14.020 21:27:52 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:14.020 21:27:52 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:14.020 21:27:52 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:14.020 21:27:52 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:14.020 21:27:52 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:14.020 21:27:52 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:14.020 21:27:52 -- common/autotest_common.sh@10 -- # set +x 00:05:14.020 21:27:52 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:14.020 21:27:52 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=3555812 00:05:14.020 21:27:52 -- spdkcli/tcp.sh@27 -- # waitforlisten 3555812 00:05:14.020 21:27:52 -- common/autotest_common.sh@819 -- # '[' -z 3555812 ']' 00:05:14.020 21:27:52 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:14.020 21:27:52 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:14.020 21:27:52 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:14.020 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:14.020 21:27:52 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:14.020 21:27:52 -- common/autotest_common.sh@10 -- # set +x 00:05:14.020 [2024-07-12 21:27:52.666049] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:14.020 [2024-07-12 21:27:52.666112] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3555812 ] 00:05:14.020 EAL: No free 2048 kB hugepages reported on node 1 00:05:14.020 [2024-07-12 21:27:52.732991] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:14.283 [2024-07-12 21:27:52.810042] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:14.283 [2024-07-12 21:27:52.810184] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:14.283 [2024-07-12 21:27:52.810187] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:14.850 21:27:53 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:14.850 21:27:53 -- common/autotest_common.sh@852 -- # return 0 00:05:14.850 21:27:53 -- spdkcli/tcp.sh@31 -- # socat_pid=3555994 00:05:14.850 21:27:53 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:14.850 21:27:53 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:15.110 [ 00:05:15.110 "spdk_get_version", 00:05:15.110 "rpc_get_methods", 00:05:15.110 "trace_get_info", 00:05:15.110 "trace_get_tpoint_group_mask", 00:05:15.110 "trace_disable_tpoint_group", 00:05:15.110 "trace_enable_tpoint_group", 00:05:15.110 "trace_clear_tpoint_mask", 00:05:15.110 "trace_set_tpoint_mask", 00:05:15.110 "vfu_tgt_set_base_path", 00:05:15.110 "framework_get_pci_devices", 00:05:15.110 "framework_get_config", 00:05:15.110 "framework_get_subsystems", 00:05:15.110 "iobuf_get_stats", 00:05:15.110 "iobuf_set_options", 00:05:15.110 "sock_set_default_impl", 00:05:15.110 "sock_impl_set_options", 00:05:15.110 "sock_impl_get_options", 00:05:15.110 "vmd_rescan", 00:05:15.110 "vmd_remove_device", 00:05:15.110 "vmd_enable", 00:05:15.110 "accel_get_stats", 00:05:15.110 "accel_set_options", 00:05:15.110 "accel_set_driver", 00:05:15.110 "accel_crypto_key_destroy", 00:05:15.110 "accel_crypto_keys_get", 00:05:15.110 "accel_crypto_key_create", 00:05:15.110 "accel_assign_opc", 00:05:15.110 "accel_get_module_info", 00:05:15.110 "accel_get_opc_assignments", 00:05:15.110 "notify_get_notifications", 00:05:15.110 "notify_get_types", 00:05:15.110 "bdev_get_histogram", 00:05:15.110 "bdev_enable_histogram", 00:05:15.110 "bdev_set_qos_limit", 00:05:15.110 "bdev_set_qd_sampling_period", 00:05:15.110 "bdev_get_bdevs", 00:05:15.110 "bdev_reset_iostat", 00:05:15.110 "bdev_get_iostat", 00:05:15.110 "bdev_examine", 00:05:15.110 "bdev_wait_for_examine", 00:05:15.110 "bdev_set_options", 00:05:15.110 "scsi_get_devices", 00:05:15.110 "thread_set_cpumask", 00:05:15.110 "framework_get_scheduler", 00:05:15.110 "framework_set_scheduler", 00:05:15.110 "framework_get_reactors", 00:05:15.110 "thread_get_io_channels", 00:05:15.110 "thread_get_pollers", 00:05:15.110 "thread_get_stats", 00:05:15.110 "framework_monitor_context_switch", 00:05:15.110 "spdk_kill_instance", 00:05:15.110 "log_enable_timestamps", 00:05:15.110 "log_get_flags", 00:05:15.110 "log_clear_flag", 00:05:15.110 "log_set_flag", 00:05:15.110 "log_get_level", 00:05:15.110 "log_set_level", 00:05:15.110 "log_get_print_level", 00:05:15.110 "log_set_print_level", 00:05:15.110 "framework_enable_cpumask_locks", 00:05:15.110 "framework_disable_cpumask_locks", 00:05:15.110 "framework_wait_init", 00:05:15.110 
"framework_start_init", 00:05:15.110 "virtio_blk_create_transport", 00:05:15.110 "virtio_blk_get_transports", 00:05:15.110 "vhost_controller_set_coalescing", 00:05:15.110 "vhost_get_controllers", 00:05:15.110 "vhost_delete_controller", 00:05:15.110 "vhost_create_blk_controller", 00:05:15.110 "vhost_scsi_controller_remove_target", 00:05:15.110 "vhost_scsi_controller_add_target", 00:05:15.110 "vhost_start_scsi_controller", 00:05:15.110 "vhost_create_scsi_controller", 00:05:15.110 "ublk_recover_disk", 00:05:15.110 "ublk_get_disks", 00:05:15.110 "ublk_stop_disk", 00:05:15.110 "ublk_start_disk", 00:05:15.110 "ublk_destroy_target", 00:05:15.110 "ublk_create_target", 00:05:15.110 "nbd_get_disks", 00:05:15.110 "nbd_stop_disk", 00:05:15.110 "nbd_start_disk", 00:05:15.110 "env_dpdk_get_mem_stats", 00:05:15.110 "nvmf_subsystem_get_listeners", 00:05:15.110 "nvmf_subsystem_get_qpairs", 00:05:15.110 "nvmf_subsystem_get_controllers", 00:05:15.110 "nvmf_get_stats", 00:05:15.110 "nvmf_get_transports", 00:05:15.110 "nvmf_create_transport", 00:05:15.110 "nvmf_get_targets", 00:05:15.110 "nvmf_delete_target", 00:05:15.110 "nvmf_create_target", 00:05:15.110 "nvmf_subsystem_allow_any_host", 00:05:15.110 "nvmf_subsystem_remove_host", 00:05:15.111 "nvmf_subsystem_add_host", 00:05:15.111 "nvmf_subsystem_remove_ns", 00:05:15.111 "nvmf_subsystem_add_ns", 00:05:15.111 "nvmf_subsystem_listener_set_ana_state", 00:05:15.111 "nvmf_discovery_get_referrals", 00:05:15.111 "nvmf_discovery_remove_referral", 00:05:15.111 "nvmf_discovery_add_referral", 00:05:15.111 "nvmf_subsystem_remove_listener", 00:05:15.111 "nvmf_subsystem_add_listener", 00:05:15.111 "nvmf_delete_subsystem", 00:05:15.111 "nvmf_create_subsystem", 00:05:15.111 "nvmf_get_subsystems", 00:05:15.111 "nvmf_set_crdt", 00:05:15.111 "nvmf_set_config", 00:05:15.111 "nvmf_set_max_subsystems", 00:05:15.111 "iscsi_set_options", 00:05:15.111 "iscsi_get_auth_groups", 00:05:15.111 "iscsi_auth_group_remove_secret", 00:05:15.111 "iscsi_auth_group_add_secret", 00:05:15.111 "iscsi_delete_auth_group", 00:05:15.111 "iscsi_create_auth_group", 00:05:15.111 "iscsi_set_discovery_auth", 00:05:15.111 "iscsi_get_options", 00:05:15.111 "iscsi_target_node_request_logout", 00:05:15.111 "iscsi_target_node_set_redirect", 00:05:15.111 "iscsi_target_node_set_auth", 00:05:15.111 "iscsi_target_node_add_lun", 00:05:15.111 "iscsi_get_connections", 00:05:15.111 "iscsi_portal_group_set_auth", 00:05:15.111 "iscsi_start_portal_group", 00:05:15.111 "iscsi_delete_portal_group", 00:05:15.111 "iscsi_create_portal_group", 00:05:15.111 "iscsi_get_portal_groups", 00:05:15.111 "iscsi_delete_target_node", 00:05:15.111 "iscsi_target_node_remove_pg_ig_maps", 00:05:15.111 "iscsi_target_node_add_pg_ig_maps", 00:05:15.111 "iscsi_create_target_node", 00:05:15.111 "iscsi_get_target_nodes", 00:05:15.111 "iscsi_delete_initiator_group", 00:05:15.111 "iscsi_initiator_group_remove_initiators", 00:05:15.111 "iscsi_initiator_group_add_initiators", 00:05:15.111 "iscsi_create_initiator_group", 00:05:15.111 "iscsi_get_initiator_groups", 00:05:15.111 "vfu_virtio_create_scsi_endpoint", 00:05:15.111 "vfu_virtio_scsi_remove_target", 00:05:15.111 "vfu_virtio_scsi_add_target", 00:05:15.111 "vfu_virtio_create_blk_endpoint", 00:05:15.111 "vfu_virtio_delete_endpoint", 00:05:15.111 "iaa_scan_accel_module", 00:05:15.111 "dsa_scan_accel_module", 00:05:15.111 "ioat_scan_accel_module", 00:05:15.111 "accel_error_inject_error", 00:05:15.111 "bdev_iscsi_delete", 00:05:15.111 "bdev_iscsi_create", 00:05:15.111 "bdev_iscsi_set_options", 
00:05:15.111 "bdev_virtio_attach_controller", 00:05:15.111 "bdev_virtio_scsi_get_devices", 00:05:15.111 "bdev_virtio_detach_controller", 00:05:15.111 "bdev_virtio_blk_set_hotplug", 00:05:15.111 "bdev_ftl_set_property", 00:05:15.111 "bdev_ftl_get_properties", 00:05:15.111 "bdev_ftl_get_stats", 00:05:15.111 "bdev_ftl_unmap", 00:05:15.111 "bdev_ftl_unload", 00:05:15.111 "bdev_ftl_delete", 00:05:15.111 "bdev_ftl_load", 00:05:15.111 "bdev_ftl_create", 00:05:15.111 "bdev_aio_delete", 00:05:15.111 "bdev_aio_rescan", 00:05:15.111 "bdev_aio_create", 00:05:15.111 "blobfs_create", 00:05:15.111 "blobfs_detect", 00:05:15.111 "blobfs_set_cache_size", 00:05:15.111 "bdev_zone_block_delete", 00:05:15.111 "bdev_zone_block_create", 00:05:15.111 "bdev_delay_delete", 00:05:15.111 "bdev_delay_create", 00:05:15.111 "bdev_delay_update_latency", 00:05:15.111 "bdev_split_delete", 00:05:15.111 "bdev_split_create", 00:05:15.111 "bdev_error_inject_error", 00:05:15.111 "bdev_error_delete", 00:05:15.111 "bdev_error_create", 00:05:15.111 "bdev_raid_set_options", 00:05:15.111 "bdev_raid_remove_base_bdev", 00:05:15.111 "bdev_raid_add_base_bdev", 00:05:15.111 "bdev_raid_delete", 00:05:15.111 "bdev_raid_create", 00:05:15.111 "bdev_raid_get_bdevs", 00:05:15.111 "bdev_lvol_grow_lvstore", 00:05:15.111 "bdev_lvol_get_lvols", 00:05:15.111 "bdev_lvol_get_lvstores", 00:05:15.111 "bdev_lvol_delete", 00:05:15.111 "bdev_lvol_set_read_only", 00:05:15.111 "bdev_lvol_resize", 00:05:15.111 "bdev_lvol_decouple_parent", 00:05:15.111 "bdev_lvol_inflate", 00:05:15.111 "bdev_lvol_rename", 00:05:15.111 "bdev_lvol_clone_bdev", 00:05:15.111 "bdev_lvol_clone", 00:05:15.111 "bdev_lvol_snapshot", 00:05:15.111 "bdev_lvol_create", 00:05:15.111 "bdev_lvol_delete_lvstore", 00:05:15.111 "bdev_lvol_rename_lvstore", 00:05:15.111 "bdev_lvol_create_lvstore", 00:05:15.111 "bdev_passthru_delete", 00:05:15.111 "bdev_passthru_create", 00:05:15.111 "bdev_nvme_cuse_unregister", 00:05:15.111 "bdev_nvme_cuse_register", 00:05:15.111 "bdev_opal_new_user", 00:05:15.111 "bdev_opal_set_lock_state", 00:05:15.111 "bdev_opal_delete", 00:05:15.111 "bdev_opal_get_info", 00:05:15.111 "bdev_opal_create", 00:05:15.111 "bdev_nvme_opal_revert", 00:05:15.111 "bdev_nvme_opal_init", 00:05:15.111 "bdev_nvme_send_cmd", 00:05:15.111 "bdev_nvme_get_path_iostat", 00:05:15.111 "bdev_nvme_get_mdns_discovery_info", 00:05:15.111 "bdev_nvme_stop_mdns_discovery", 00:05:15.111 "bdev_nvme_start_mdns_discovery", 00:05:15.111 "bdev_nvme_set_multipath_policy", 00:05:15.111 "bdev_nvme_set_preferred_path", 00:05:15.111 "bdev_nvme_get_io_paths", 00:05:15.111 "bdev_nvme_remove_error_injection", 00:05:15.111 "bdev_nvme_add_error_injection", 00:05:15.111 "bdev_nvme_get_discovery_info", 00:05:15.111 "bdev_nvme_stop_discovery", 00:05:15.111 "bdev_nvme_start_discovery", 00:05:15.111 "bdev_nvme_get_controller_health_info", 00:05:15.111 "bdev_nvme_disable_controller", 00:05:15.111 "bdev_nvme_enable_controller", 00:05:15.111 "bdev_nvme_reset_controller", 00:05:15.111 "bdev_nvme_get_transport_statistics", 00:05:15.111 "bdev_nvme_apply_firmware", 00:05:15.111 "bdev_nvme_detach_controller", 00:05:15.111 "bdev_nvme_get_controllers", 00:05:15.111 "bdev_nvme_attach_controller", 00:05:15.111 "bdev_nvme_set_hotplug", 00:05:15.111 "bdev_nvme_set_options", 00:05:15.111 "bdev_null_resize", 00:05:15.111 "bdev_null_delete", 00:05:15.111 "bdev_null_create", 00:05:15.111 "bdev_malloc_delete", 00:05:15.111 "bdev_malloc_create" 00:05:15.111 ] 00:05:15.111 21:27:53 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
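The spdkcli_tcp setup above deserves a note: spdk_tgt listens only on a Unix socket, so the test bridges a TCP port to it with socat and points rpc.py at the TCP side. Both commands appear verbatim in the trace:

# Bridge TCP port 9998 to the target's Unix socket, then issue an RPC over
# TCP; -r sets connection retries and -t the per-attempt timeout in seconds.
socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods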
00:05:15.111 21:27:53 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:15.111 21:27:53 -- common/autotest_common.sh@10 -- # set +x 00:05:15.111 21:27:53 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:15.111 21:27:53 -- spdkcli/tcp.sh@38 -- # killprocess 3555812 00:05:15.111 21:27:53 -- common/autotest_common.sh@926 -- # '[' -z 3555812 ']' 00:05:15.111 21:27:53 -- common/autotest_common.sh@930 -- # kill -0 3555812 00:05:15.111 21:27:53 -- common/autotest_common.sh@931 -- # uname 00:05:15.111 21:27:53 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:15.111 21:27:53 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3555812 00:05:15.111 21:27:53 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:15.111 21:27:53 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:15.111 21:27:53 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3555812' 00:05:15.111 killing process with pid 3555812 00:05:15.111 21:27:53 -- common/autotest_common.sh@945 -- # kill 3555812 00:05:15.111 21:27:53 -- common/autotest_common.sh@950 -- # wait 3555812 00:05:15.371 00:05:15.371 real 0m1.503s 00:05:15.371 user 0m2.778s 00:05:15.371 sys 0m0.472s 00:05:15.371 21:27:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:15.371 21:27:54 -- common/autotest_common.sh@10 -- # set +x 00:05:15.371 ************************************ 00:05:15.371 END TEST spdkcli_tcp 00:05:15.371 ************************************ 00:05:15.371 21:27:54 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:15.371 21:27:54 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:15.371 21:27:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:15.371 21:27:54 -- common/autotest_common.sh@10 -- # set +x 00:05:15.371 ************************************ 00:05:15.371 START TEST dpdk_mem_utility 00:05:15.371 ************************************ 00:05:15.371 21:27:54 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:15.630 * Looking for test storage... 00:05:15.630 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:15.630 21:27:54 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:15.630 21:27:54 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=3556154 00:05:15.630 21:27:54 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:15.630 21:27:54 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 3556154 00:05:15.630 21:27:54 -- common/autotest_common.sh@819 -- # '[' -z 3556154 ']' 00:05:15.630 21:27:54 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:15.630 21:27:54 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:15.630 21:27:54 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:15.630 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:15.630 21:27:54 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:15.630 21:27:54 -- common/autotest_common.sh@10 -- # set +x 00:05:15.630 [2024-07-12 21:27:54.218093] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:15.630 [2024-07-12 21:27:54.218172] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3556154 ] 00:05:15.630 EAL: No free 2048 kB hugepages reported on node 1 00:05:15.630 [2024-07-12 21:27:54.286943] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:15.630 [2024-07-12 21:27:54.363018] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:15.630 [2024-07-12 21:27:54.363144] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:16.568 21:27:55 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:16.568 21:27:55 -- common/autotest_common.sh@852 -- # return 0 00:05:16.568 21:27:55 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:16.568 21:27:55 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:16.568 21:27:55 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:16.568 21:27:55 -- common/autotest_common.sh@10 -- # set +x 00:05:16.568 { 00:05:16.568 "filename": "/tmp/spdk_mem_dump.txt" 00:05:16.568 } 00:05:16.568 21:27:55 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:16.568 21:27:55 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:16.568 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:16.568 1 heaps totaling size 814.000000 MiB 00:05:16.568 size: 814.000000 MiB heap id: 0 00:05:16.568 end heaps---------- 00:05:16.568 8 mempools totaling size 598.116089 MiB 00:05:16.568 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:16.568 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:16.568 size: 84.521057 MiB name: bdev_io_3556154 00:05:16.568 size: 51.011292 MiB name: evtpool_3556154 00:05:16.568 size: 50.003479 MiB name: msgpool_3556154 00:05:16.568 size: 21.763794 MiB name: PDU_Pool 00:05:16.568 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:16.568 size: 0.026123 MiB name: Session_Pool 00:05:16.568 end mempools------- 00:05:16.568 6 memzones totaling size 4.142822 MiB 00:05:16.568 size: 1.000366 MiB name: RG_ring_0_3556154 00:05:16.568 size: 1.000366 MiB name: RG_ring_1_3556154 00:05:16.568 size: 1.000366 MiB name: RG_ring_4_3556154 00:05:16.568 size: 1.000366 MiB name: RG_ring_5_3556154 00:05:16.568 size: 0.125366 MiB name: RG_ring_2_3556154 00:05:16.568 size: 0.015991 MiB name: RG_ring_3_3556154 00:05:16.568 end memzones------- 00:05:16.568 21:27:55 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:16.568 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:16.568 list of free elements. 
size: 12.519348 MiB 00:05:16.568 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:16.568 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:16.568 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:16.568 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:16.568 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:16.568 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:16.568 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:16.568 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:16.568 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:16.568 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:16.568 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:16.568 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:16.568 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:16.568 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:16.568 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:16.568 list of standard malloc elements. size: 199.218079 MiB 00:05:16.568 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:16.568 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:16.568 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:16.568 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:16.568 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:16.568 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:16.568 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:16.568 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:16.568 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:16.568 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:16.568 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:16.568 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:16.568 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:16.568 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:16.568 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:16.568 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:16.568 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:16.568 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:16.568 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:16.568 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:16.568 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:16.568 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:16.568 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:16.568 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:16.568 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:16.568 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:16.568 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:16.568 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:16.568 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:16.568 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:16.568 list of memzone associated elements. size: 602.262573 MiB 00:05:16.568 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:16.568 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:16.568 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:16.568 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:16.568 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:16.569 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_3556154_0 00:05:16.569 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:16.569 associated memzone info: size: 48.002930 MiB name: MP_evtpool_3556154_0 00:05:16.569 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:16.569 associated memzone info: size: 48.002930 MiB name: MP_msgpool_3556154_0 00:05:16.569 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:16.569 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:16.569 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:16.569 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:16.569 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:16.569 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_3556154 00:05:16.569 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:16.569 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_3556154 00:05:16.569 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:16.569 associated memzone info: size: 1.007996 MiB name: MP_evtpool_3556154 00:05:16.569 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:16.569 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:16.569 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:16.569 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:16.569 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:16.569 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:16.569 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:16.569 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:16.569 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:16.569 associated memzone info: size: 1.000366 MiB name: RG_ring_0_3556154 00:05:16.569 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:16.569 associated memzone info: size: 1.000366 MiB name: RG_ring_1_3556154 00:05:16.569 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:16.569 associated memzone info: size: 1.000366 MiB name: RG_ring_4_3556154 00:05:16.569 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:16.569 associated memzone info: size: 1.000366 MiB name: RG_ring_5_3556154 00:05:16.569 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:16.569 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_3556154 00:05:16.569 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:16.569 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:16.569 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:16.569 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:16.569 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:16.569 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:16.569 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:16.569 associated memzone info: size: 0.125366 MiB name: RG_ring_2_3556154 00:05:16.569 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:16.569 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:16.569 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:16.569 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:16.569 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:16.569 associated memzone info: size: 0.015991 MiB name: RG_ring_3_3556154 00:05:16.569 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:16.569 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:16.569 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:16.569 associated memzone info: size: 0.000183 MiB name: MP_msgpool_3556154 00:05:16.569 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:16.569 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_3556154 00:05:16.569 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:16.569 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:16.569 21:27:55 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:16.569 21:27:55 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 3556154 00:05:16.569 21:27:55 -- common/autotest_common.sh@926 -- # '[' -z 3556154 ']' 00:05:16.569 21:27:55 -- common/autotest_common.sh@930 -- # kill -0 3556154 00:05:16.569 21:27:55 -- common/autotest_common.sh@931 -- # uname 00:05:16.569 21:27:55 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:16.569 21:27:55 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3556154 00:05:16.569 21:27:55 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:16.569 21:27:55 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:16.569 21:27:55 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3556154' 00:05:16.569 killing process with pid 3556154 00:05:16.569 21:27:55 -- common/autotest_common.sh@945 -- # kill 3556154 00:05:16.569 21:27:55 -- common/autotest_common.sh@950 -- # wait 3556154 00:05:16.828 00:05:16.828 real 0m1.388s 00:05:16.828 user 0m1.420s 00:05:16.828 sys 0m0.426s 00:05:16.828 21:27:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:16.828 21:27:55 -- common/autotest_common.sh@10 -- # set +x 00:05:16.828 ************************************ 00:05:16.828 END TEST dpdk_mem_utility 00:05:16.828 ************************************ 00:05:16.828 21:27:55 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:16.828 21:27:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:16.828 21:27:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:16.828 21:27:55 -- common/autotest_common.sh@10 -- # set +x 
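The dpdk_mem_utility pass above reduces to one RPC plus a helper script; by hand, with the commands and dump path shown in the trace:

# Dump DPDK allocator state via RPC, then summarize it two ways.
scripts/rpc.py env_dpdk_get_mem_stats     # -> {"filename": "/tmp/spdk_mem_dump.txt"}
scripts/dpdk_mem_info.py                  # heap / mempool / memzone totals
scripts/dpdk_mem_info.py -m 0             # per-element detail for heap 0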
00:05:16.828 ************************************ 00:05:16.828 START TEST event 00:05:16.828 ************************************ 00:05:16.828 21:27:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:16.828 * Looking for test storage... 00:05:16.828 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:16.828 21:27:55 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:16.828 21:27:55 -- bdev/nbd_common.sh@6 -- # set -e 00:05:16.828 21:27:55 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:17.088 21:27:55 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:05:17.088 21:27:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:17.088 21:27:55 -- common/autotest_common.sh@10 -- # set +x 00:05:17.088 ************************************ 00:05:17.088 START TEST event_perf 00:05:17.088 ************************************ 00:05:17.088 21:27:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:17.088 Running I/O for 1 seconds...[2024-07-12 21:27:55.635489] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:17.088 [2024-07-12 21:27:55.635582] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3556408 ] 00:05:17.088 EAL: No free 2048 kB hugepages reported on node 1 00:05:17.088 [2024-07-12 21:27:55.708440] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:17.088 [2024-07-12 21:27:55.781042] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:17.088 [2024-07-12 21:27:55.781139] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:17.088 [2024-07-12 21:27:55.781229] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:17.088 [2024-07-12 21:27:55.781231] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:18.467 Running I/O for 1 seconds... 00:05:18.467 lcore 0: 199539 00:05:18.467 lcore 1: 199538 00:05:18.467 lcore 2: 199538 00:05:18.467 lcore 3: 199539 00:05:18.467 done. 
00:05:18.467 00:05:18.467 real 0m1.228s 00:05:18.467 user 0m4.131s 00:05:18.467 sys 0m0.094s 00:05:18.467 21:27:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:18.467 21:27:56 -- common/autotest_common.sh@10 -- # set +x 00:05:18.467 ************************************ 00:05:18.467 END TEST event_perf 00:05:18.467 ************************************ 00:05:18.467 21:27:56 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:18.467 21:27:56 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:18.467 21:27:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:18.467 21:27:56 -- common/autotest_common.sh@10 -- # set +x 00:05:18.467 ************************************ 00:05:18.467 START TEST event_reactor 00:05:18.467 ************************************ 00:05:18.467 21:27:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:18.467 [2024-07-12 21:27:56.911863] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:18.467 [2024-07-12 21:27:56.911966] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3556676 ] 00:05:18.467 EAL: No free 2048 kB hugepages reported on node 1 00:05:18.467 [2024-07-12 21:27:56.982880] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:18.467 [2024-07-12 21:27:57.047627] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:19.404 test_start 00:05:19.404 oneshot 00:05:19.404 tick 100 00:05:19.404 tick 100 00:05:19.404 tick 250 00:05:19.404 tick 100 00:05:19.404 tick 100 00:05:19.404 tick 250 00:05:19.404 tick 500 00:05:19.404 tick 100 00:05:19.404 tick 100 00:05:19.404 tick 100 00:05:19.404 tick 250 00:05:19.404 tick 100 00:05:19.404 tick 100 00:05:19.404 test_end 00:05:19.404 00:05:19.404 real 0m1.215s 00:05:19.404 user 0m1.125s 00:05:19.404 sys 0m0.086s 00:05:19.404 21:27:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:19.404 21:27:58 -- common/autotest_common.sh@10 -- # set +x 00:05:19.404 ************************************ 00:05:19.404 END TEST event_reactor 00:05:19.404 ************************************ 00:05:19.404 21:27:58 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:19.404 21:27:58 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:19.404 21:27:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:19.404 21:27:58 -- common/autotest_common.sh@10 -- # set +x 00:05:19.404 ************************************ 00:05:19.405 START TEST event_reactor_perf 00:05:19.405 ************************************ 00:05:19.405 21:27:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:19.405 [2024-07-12 21:27:58.177119] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:19.405 [2024-07-12 21:27:58.177221] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3556958 ] 00:05:19.664 EAL: No free 2048 kB hugepages reported on node 1 00:05:19.664 [2024-07-12 21:27:58.247484] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:19.664 [2024-07-12 21:27:58.312464] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:20.601 test_start 00:05:20.601 test_end 00:05:20.601 Performance: 927127 events per second 00:05:20.601 00:05:20.601 real 0m1.215s 00:05:20.601 user 0m1.129s 00:05:20.601 sys 0m0.082s 00:05:20.601 21:27:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:20.601 21:27:59 -- common/autotest_common.sh@10 -- # set +x 00:05:20.601 ************************************ 00:05:20.601 END TEST event_reactor_perf 00:05:20.601 ************************************ 00:05:20.860 21:27:59 -- event/event.sh@49 -- # uname -s 00:05:20.860 21:27:59 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:20.860 21:27:59 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:20.860 21:27:59 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:20.860 21:27:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:20.860 21:27:59 -- common/autotest_common.sh@10 -- # set +x 00:05:20.860 ************************************ 00:05:20.860 START TEST event_scheduler 00:05:20.860 ************************************ 00:05:20.860 21:27:59 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:20.860 * Looking for test storage... 00:05:20.860 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:20.860 21:27:59 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:20.860 21:27:59 -- scheduler/scheduler.sh@35 -- # scheduler_pid=3557277 00:05:20.861 21:27:59 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:20.861 21:27:59 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:20.861 21:27:59 -- scheduler/scheduler.sh@37 -- # waitforlisten 3557277 00:05:20.861 21:27:59 -- common/autotest_common.sh@819 -- # '[' -z 3557277 ']' 00:05:20.861 21:27:59 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:20.861 21:27:59 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:20.861 21:27:59 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:20.861 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:20.861 21:27:59 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:20.861 21:27:59 -- common/autotest_common.sh@10 -- # set +x 00:05:20.861 [2024-07-12 21:27:59.541681] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:20.861 [2024-07-12 21:27:59.541744] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3557277 ] 00:05:20.861 EAL: No free 2048 kB hugepages reported on node 1 00:05:20.861 [2024-07-12 21:27:59.602918] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:21.120 [2024-07-12 21:27:59.680071] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:21.120 [2024-07-12 21:27:59.680154] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:21.120 [2024-07-12 21:27:59.680258] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:21.120 [2024-07-12 21:27:59.680260] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:21.687 21:28:00 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:21.687 21:28:00 -- common/autotest_common.sh@852 -- # return 0 00:05:21.687 21:28:00 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:21.687 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.687 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.687 POWER: Env isn't set yet! 00:05:21.687 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:21.687 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:21.687 POWER: Cannot set governor of lcore 0 to userspace 00:05:21.687 POWER: Attempting to initialise PSTAT power management... 00:05:21.687 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:21.687 POWER: Initialized successfully for lcore 0 power management 00:05:21.687 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:21.687 POWER: Initialized successfully for lcore 1 power management 00:05:21.687 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:21.687 POWER: Initialized successfully for lcore 2 power management 00:05:21.687 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:21.687 POWER: Initialized successfully for lcore 3 power management 00:05:21.687 [2024-07-12 21:28:00.419614] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:21.687 [2024-07-12 21:28:00.419630] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:21.687 [2024-07-12 21:28:00.419641] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:21.687 21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.687 21:28:00 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:21.687 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.687 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 [2024-07-12 21:28:00.486879] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
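[Note: the scheduler app above is launched with --wait-for-rpc, so the test can switch to the dynamic scheduler over RPC before subsystem initialization runs, and only then call framework_start_init. The equivalent manual sequence would look roughly like this (paths shortened; an illustration of the rpc_cmd calls in the log, not the test script itself):

    # Start the scheduler test app paused, then configure it over RPC.
    test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f &
    scripts/rpc.py framework_set_scheduler dynamic   # must precede subsystem init
    scripts/rpc.py framework_start_init              # releases --wait-for-rpc
]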
00:05:21.947 21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:21.947 21:28:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:21.947 21:28:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 ************************************ 00:05:21.947 START TEST scheduler_create_thread 00:05:21.947 ************************************ 00:05:21.947 21:28:00 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:21.947 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 2 00:05:21.947 21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:21.947 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 3 00:05:21.947 21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:21.947 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 4 00:05:21.947 21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:21.947 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 5 00:05:21.947 21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:21.947 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 6 00:05:21.947 21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:21.947 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 7 00:05:21.947 21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:21.947 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 8 00:05:21.947 21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:21.947 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 9 00:05:21.947 
21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:21.947 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 10 00:05:21.947 21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:21.947 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:21.947 21:28:00 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:21.947 21:28:00 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:21.947 21:28:00 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:21.947 21:28:00 -- common/autotest_common.sh@10 -- # set +x 00:05:22.883 21:28:01 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:22.883 21:28:01 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:22.883 21:28:01 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:22.883 21:28:01 -- common/autotest_common.sh@10 -- # set +x 00:05:24.259 21:28:02 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:24.259 21:28:02 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:24.259 21:28:02 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:24.259 21:28:02 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:24.259 21:28:02 -- common/autotest_common.sh@10 -- # set +x 00:05:25.195 21:28:03 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:25.195 00:05:25.195 real 0m3.382s 00:05:25.195 user 0m0.025s 00:05:25.195 sys 0m0.006s 00:05:25.195 21:28:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.195 21:28:03 -- common/autotest_common.sh@10 -- # set +x 00:05:25.195 ************************************ 00:05:25.195 END TEST scheduler_create_thread 00:05:25.195 ************************************ 00:05:25.195 21:28:03 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:25.195 21:28:03 -- scheduler/scheduler.sh@46 -- # killprocess 3557277 00:05:25.195 21:28:03 -- common/autotest_common.sh@926 -- # '[' -z 3557277 ']' 00:05:25.195 21:28:03 -- common/autotest_common.sh@930 -- # kill -0 3557277 00:05:25.195 21:28:03 -- common/autotest_common.sh@931 -- # uname 00:05:25.195 21:28:03 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:25.195 21:28:03 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3557277 00:05:25.195 21:28:03 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:05:25.195 21:28:03 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:05:25.195 21:28:03 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3557277' 00:05:25.195 killing process with pid 3557277 00:05:25.195 21:28:03 -- common/autotest_common.sh@945 -- # kill 3557277 00:05:25.195 21:28:03 -- common/autotest_common.sh@950 -- # wait 3557277 00:05:25.763 [2024-07-12 21:28:04.258701] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
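[Note: the scheduler_create_thread run above drives the app through the test-only scheduler_plugin RPCs: it creates pinned threads at various active percentages, creates an unpinned half_active thread whose load it then raises with scheduler_thread_set_active, and finally creates and deletes a throwaway thread. Issued by hand against the scheduler test app, the same calls would look like this (thread ids 11 and 12 are the ones returned in this particular run; the plugin only loads with the scheduler test app):

    # RPCs mirrored from the run above.
    rpc.py --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
    rpc.py --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0    # -> thread id 11
    rpc.py --plugin scheduler_plugin scheduler_thread_set_active 11 50              # raise it to 50% busy
    rpc.py --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100      # -> thread id 12
    rpc.py --plugin scheduler_plugin scheduler_thread_delete 12                     # then drop it
]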
00:05:25.763 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:25.763 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:25.763 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:25.763 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:25.763 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:25.763 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:25.763 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:25.763 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:25.763 00:05:25.763 real 0m5.050s 00:05:25.763 user 0m10.470s 00:05:25.763 sys 0m0.398s 00:05:25.763 21:28:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.763 21:28:04 -- common/autotest_common.sh@10 -- # set +x 00:05:25.763 ************************************ 00:05:25.763 END TEST event_scheduler 00:05:25.763 ************************************ 00:05:25.763 21:28:04 -- event/event.sh@51 -- # modprobe -n nbd 00:05:25.763 21:28:04 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:25.763 21:28:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:25.763 21:28:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:25.763 21:28:04 -- common/autotest_common.sh@10 -- # set +x 00:05:25.763 ************************************ 00:05:25.763 START TEST app_repeat 00:05:25.763 ************************************ 00:05:25.763 21:28:04 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:05:25.763 21:28:04 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:25.763 21:28:04 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:25.763 21:28:04 -- event/event.sh@13 -- # local nbd_list 00:05:25.763 21:28:04 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:25.763 21:28:04 -- event/event.sh@14 -- # local bdev_list 00:05:25.763 21:28:04 -- event/event.sh@15 -- # local repeat_times=4 00:05:25.763 21:28:04 -- event/event.sh@17 -- # modprobe nbd 00:05:26.022 21:28:04 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:26.022 21:28:04 -- event/event.sh@19 -- # repeat_pid=3558141 00:05:26.022 21:28:04 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:26.022 21:28:04 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 3558141' 00:05:26.022 Process app_repeat pid: 3558141 00:05:26.022 21:28:04 -- event/event.sh@23 -- # for i in {0..2} 00:05:26.022 21:28:04 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:26.022 spdk_app_start Round 0 00:05:26.022 21:28:04 -- event/event.sh@25 -- # waitforlisten 3558141 /var/tmp/spdk-nbd.sock 00:05:26.022 21:28:04 -- common/autotest_common.sh@819 -- # '[' -z 3558141 ']' 00:05:26.022 21:28:04 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:26.022 21:28:04 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:26.022 21:28:04 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:05:26.022 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:26.022 21:28:04 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:26.022 21:28:04 -- common/autotest_common.sh@10 -- # set +x 00:05:26.022 [2024-07-12 21:28:04.556314] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:26.022 [2024-07-12 21:28:04.556396] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3558141 ] 00:05:26.022 EAL: No free 2048 kB hugepages reported on node 1 00:05:26.022 [2024-07-12 21:28:04.625195] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:26.022 [2024-07-12 21:28:04.691880] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:26.022 [2024-07-12 21:28:04.691882] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:26.960 21:28:05 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:26.960 21:28:05 -- common/autotest_common.sh@852 -- # return 0 00:05:26.960 21:28:05 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:26.960 Malloc0 00:05:26.960 21:28:05 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:26.960 Malloc1 00:05:26.960 21:28:05 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@12 -- # local i 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:26.960 21:28:05 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:27.218 /dev/nbd0 00:05:27.218 21:28:05 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:27.218 21:28:05 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:27.218 21:28:05 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:27.218 21:28:05 -- common/autotest_common.sh@857 -- # local i 00:05:27.218 21:28:05 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:27.218 21:28:05 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:27.218 21:28:05 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:27.218 21:28:05 -- 
common/autotest_common.sh@861 -- # break 00:05:27.218 21:28:05 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:27.218 21:28:05 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:27.218 21:28:05 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:27.218 1+0 records in 00:05:27.218 1+0 records out 00:05:27.218 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000217304 s, 18.8 MB/s 00:05:27.218 21:28:05 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:27.218 21:28:05 -- common/autotest_common.sh@874 -- # size=4096 00:05:27.218 21:28:05 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:27.218 21:28:05 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:27.218 21:28:05 -- common/autotest_common.sh@877 -- # return 0 00:05:27.218 21:28:05 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:27.218 21:28:05 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:27.218 21:28:05 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:27.477 /dev/nbd1 00:05:27.477 21:28:06 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:27.477 21:28:06 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:27.477 21:28:06 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:27.477 21:28:06 -- common/autotest_common.sh@857 -- # local i 00:05:27.477 21:28:06 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:27.477 21:28:06 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:27.477 21:28:06 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:27.477 21:28:06 -- common/autotest_common.sh@861 -- # break 00:05:27.477 21:28:06 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:27.477 21:28:06 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:27.477 21:28:06 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:27.477 1+0 records in 00:05:27.477 1+0 records out 00:05:27.477 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236322 s, 17.3 MB/s 00:05:27.477 21:28:06 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:27.477 21:28:06 -- common/autotest_common.sh@874 -- # size=4096 00:05:27.477 21:28:06 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:27.477 21:28:06 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:27.477 21:28:06 -- common/autotest_common.sh@877 -- # return 0 00:05:27.477 21:28:06 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:27.477 21:28:06 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:27.477 21:28:06 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:27.477 21:28:06 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:27.477 21:28:06 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:27.735 { 00:05:27.735 "nbd_device": "/dev/nbd0", 00:05:27.735 "bdev_name": "Malloc0" 00:05:27.735 }, 00:05:27.735 { 00:05:27.735 "nbd_device": 
"/dev/nbd1", 00:05:27.735 "bdev_name": "Malloc1" 00:05:27.735 } 00:05:27.735 ]' 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:27.735 { 00:05:27.735 "nbd_device": "/dev/nbd0", 00:05:27.735 "bdev_name": "Malloc0" 00:05:27.735 }, 00:05:27.735 { 00:05:27.735 "nbd_device": "/dev/nbd1", 00:05:27.735 "bdev_name": "Malloc1" 00:05:27.735 } 00:05:27.735 ]' 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:27.735 /dev/nbd1' 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:27.735 /dev/nbd1' 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@65 -- # count=2 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@95 -- # count=2 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:27.735 256+0 records in 00:05:27.735 256+0 records out 00:05:27.735 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114967 s, 91.2 MB/s 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:27.735 256+0 records in 00:05:27.735 256+0 records out 00:05:27.735 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0204449 s, 51.3 MB/s 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:27.735 256+0 records in 00:05:27.735 256+0 records out 00:05:27.735 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0218215 s, 48.1 MB/s 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:27.735 21:28:06 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@51 -- # local i 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:27.735 21:28:06 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:27.994 21:28:06 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:27.994 21:28:06 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:27.994 21:28:06 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:27.994 21:28:06 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:27.994 21:28:06 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:27.994 21:28:06 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:27.994 21:28:06 -- bdev/nbd_common.sh@41 -- # break 00:05:27.994 21:28:06 -- bdev/nbd_common.sh@45 -- # return 0 00:05:27.994 21:28:06 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:27.994 21:28:06 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@41 -- # break 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@45 -- # return 0 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:28.253 21:28:06 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:28.253 21:28:07 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:28.253 21:28:07 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:28.512 21:28:07 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:28.512 21:28:07 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:28.512 21:28:07 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:28.512 21:28:07 -- bdev/nbd_common.sh@65 -- # true 00:05:28.512 21:28:07 -- bdev/nbd_common.sh@65 -- # count=0 00:05:28.512 21:28:07 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:28.512 21:28:07 -- bdev/nbd_common.sh@104 -- # count=0 00:05:28.512 21:28:07 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:28.512 21:28:07 -- bdev/nbd_common.sh@109 -- # return 0 00:05:28.512 21:28:07 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
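[Note: Round 0 of app_repeat above follows the standard nbd_common.sh verify flow: export Malloc0/Malloc1 over NBD, fill a scratch file with 1 MiB of random data, dd it onto each device with O_DIRECT, then cmp the devices back against the file before stopping the disks. Condensed from the helper calls in the log; $SPDK_DIR stands in for the workspace checkout, and the real logic lives in test/bdev/nbd_common.sh:

    # Write-then-verify pass over both NBD devices, as logged above.
    tmp=$SPDK_DIR/test/event/nbdrandtest
    dd if=/dev/urandom of="$tmp" bs=4096 count=256              # 1 MiB of random data
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if="$tmp" of="$nbd" bs=4096 count=256 oflag=direct   # write phase
    done
    for nbd in /dev/nbd0 /dev/nbd1; do
        cmp -b -n 1M "$tmp" "$nbd"                              # verify phase
    done
    rm "$tmp"
]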
00:05:28.513 21:28:07 -- event/event.sh@35 -- # sleep 3 00:05:28.772 [2024-07-12 21:28:07.418757] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:28.772 [2024-07-12 21:28:07.482382] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:28.772 [2024-07-12 21:28:07.482384] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:28.772 [2024-07-12 21:28:07.523061] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:28.772 [2024-07-12 21:28:07.523104] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:32.059 21:28:10 -- event/event.sh@23 -- # for i in {0..2} 00:05:32.059 21:28:10 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:32.059 spdk_app_start Round 1 00:05:32.059 21:28:10 -- event/event.sh@25 -- # waitforlisten 3558141 /var/tmp/spdk-nbd.sock 00:05:32.059 21:28:10 -- common/autotest_common.sh@819 -- # '[' -z 3558141 ']' 00:05:32.059 21:28:10 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:32.059 21:28:10 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:32.059 21:28:10 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:32.059 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:32.059 21:28:10 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:32.059 21:28:10 -- common/autotest_common.sh@10 -- # set +x 00:05:32.059 21:28:10 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:32.059 21:28:10 -- common/autotest_common.sh@852 -- # return 0 00:05:32.059 21:28:10 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:32.059 Malloc0 00:05:32.059 21:28:10 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:32.059 Malloc1 00:05:32.059 21:28:10 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@12 -- # local i 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:32.059 21:28:10 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:32.318 
/dev/nbd0 00:05:32.318 21:28:10 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:32.318 21:28:10 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:32.318 21:28:10 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:32.318 21:28:10 -- common/autotest_common.sh@857 -- # local i 00:05:32.318 21:28:10 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:32.318 21:28:10 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:32.318 21:28:10 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:32.318 21:28:10 -- common/autotest_common.sh@861 -- # break 00:05:32.318 21:28:10 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:32.318 21:28:10 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:32.318 21:28:10 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:32.318 1+0 records in 00:05:32.318 1+0 records out 00:05:32.318 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252786 s, 16.2 MB/s 00:05:32.318 21:28:10 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:32.318 21:28:10 -- common/autotest_common.sh@874 -- # size=4096 00:05:32.318 21:28:10 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:32.318 21:28:10 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:32.318 21:28:10 -- common/autotest_common.sh@877 -- # return 0 00:05:32.318 21:28:10 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:32.318 21:28:10 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:32.318 21:28:10 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:32.577 /dev/nbd1 00:05:32.577 21:28:11 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:32.577 21:28:11 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:32.577 21:28:11 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:32.577 21:28:11 -- common/autotest_common.sh@857 -- # local i 00:05:32.577 21:28:11 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:32.577 21:28:11 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:32.577 21:28:11 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:32.577 21:28:11 -- common/autotest_common.sh@861 -- # break 00:05:32.577 21:28:11 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:32.577 21:28:11 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:32.577 21:28:11 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:32.577 1+0 records in 00:05:32.577 1+0 records out 00:05:32.577 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254085 s, 16.1 MB/s 00:05:32.577 21:28:11 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:32.577 21:28:11 -- common/autotest_common.sh@874 -- # size=4096 00:05:32.577 21:28:11 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:32.577 21:28:11 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:32.577 21:28:11 -- common/autotest_common.sh@877 -- # return 0 00:05:32.577 21:28:11 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:32.577 21:28:11 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:05:32.577 21:28:11 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:32.577 21:28:11 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:32.577 21:28:11 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:32.577 21:28:11 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:32.577 { 00:05:32.577 "nbd_device": "/dev/nbd0", 00:05:32.577 "bdev_name": "Malloc0" 00:05:32.577 }, 00:05:32.577 { 00:05:32.577 "nbd_device": "/dev/nbd1", 00:05:32.577 "bdev_name": "Malloc1" 00:05:32.577 } 00:05:32.577 ]' 00:05:32.577 21:28:11 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:32.577 { 00:05:32.577 "nbd_device": "/dev/nbd0", 00:05:32.577 "bdev_name": "Malloc0" 00:05:32.577 }, 00:05:32.577 { 00:05:32.577 "nbd_device": "/dev/nbd1", 00:05:32.577 "bdev_name": "Malloc1" 00:05:32.577 } 00:05:32.577 ]' 00:05:32.577 21:28:11 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:32.836 /dev/nbd1' 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:32.836 /dev/nbd1' 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@65 -- # count=2 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@95 -- # count=2 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:32.836 256+0 records in 00:05:32.836 256+0 records out 00:05:32.836 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0112649 s, 93.1 MB/s 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:32.836 256+0 records in 00:05:32.836 256+0 records out 00:05:32.836 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0206962 s, 50.7 MB/s 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:32.836 256+0 records in 00:05:32.836 256+0 records out 00:05:32.836 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0215922 s, 48.6 MB/s 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@51 -- # local i 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:32.836 21:28:11 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@41 -- # break 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@45 -- # return 0 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@41 -- # break 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@45 -- # return 0 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:33.095 21:28:11 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:33.353 21:28:12 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:33.353 21:28:12 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:33.353 21:28:12 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:33.353 21:28:12 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:33.353 21:28:12 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:33.353 21:28:12 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:33.353 21:28:12 -- bdev/nbd_common.sh@65 -- # true 00:05:33.353 21:28:12 -- bdev/nbd_common.sh@65 -- # count=0 00:05:33.353 21:28:12 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:33.353 21:28:12 -- bdev/nbd_common.sh@104 -- # count=0 00:05:33.353 21:28:12 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:33.353 21:28:12 -- bdev/nbd_common.sh@109 -- # return 0 00:05:33.353 21:28:12 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:33.612 21:28:12 -- event/event.sh@35 -- # sleep 3 00:05:33.871 [2024-07-12 21:28:12.450318] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:33.871 [2024-07-12 21:28:12.513907] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:33.871 [2024-07-12 21:28:12.513908] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:33.871 [2024-07-12 21:28:12.554621] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:33.871 [2024-07-12 21:28:12.554664] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:37.159 21:28:15 -- event/event.sh@23 -- # for i in {0..2} 00:05:37.159 21:28:15 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:37.159 spdk_app_start Round 2 00:05:37.159 21:28:15 -- event/event.sh@25 -- # waitforlisten 3558141 /var/tmp/spdk-nbd.sock 00:05:37.159 21:28:15 -- common/autotest_common.sh@819 -- # '[' -z 3558141 ']' 00:05:37.159 21:28:15 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:37.159 21:28:15 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:37.159 21:28:15 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:37.159 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:05:37.159 21:28:15 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:37.159 21:28:15 -- common/autotest_common.sh@10 -- # set +x 00:05:37.159 21:28:15 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:37.159 21:28:15 -- common/autotest_common.sh@852 -- # return 0 00:05:37.159 21:28:15 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:37.159 Malloc0 00:05:37.159 21:28:15 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:37.159 Malloc1 00:05:37.159 21:28:15 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@12 -- # local i 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:37.159 21:28:15 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:37.418 /dev/nbd0 00:05:37.418 21:28:15 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:37.418 21:28:15 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:37.418 21:28:15 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:37.418 21:28:15 -- common/autotest_common.sh@857 -- # local i 00:05:37.418 21:28:15 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:37.418 21:28:15 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:37.418 21:28:15 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:37.418 21:28:15 -- common/autotest_common.sh@861 -- # break 00:05:37.418 21:28:15 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:37.418 21:28:15 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:37.418 21:28:15 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:37.418 1+0 records in 00:05:37.418 1+0 records out 00:05:37.418 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218282 s, 18.8 MB/s 00:05:37.418 21:28:15 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:37.418 21:28:15 -- common/autotest_common.sh@874 -- # size=4096 00:05:37.418 21:28:15 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:37.418 21:28:15 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:37.418 21:28:15 -- common/autotest_common.sh@877 -- # return 0 00:05:37.418 21:28:15 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:37.418 21:28:15 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:37.418 21:28:15 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:37.418 /dev/nbd1 00:05:37.418 21:28:16 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:37.418 21:28:16 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:37.418 21:28:16 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:37.418 21:28:16 -- common/autotest_common.sh@857 -- # local i 00:05:37.418 21:28:16 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:37.418 21:28:16 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:37.418 21:28:16 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:37.418 21:28:16 -- common/autotest_common.sh@861 -- # break 00:05:37.418 21:28:16 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:37.418 21:28:16 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:37.418 21:28:16 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:37.418 1+0 records in 00:05:37.418 1+0 records out 00:05:37.418 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252473 s, 16.2 MB/s 00:05:37.418 21:28:16 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:37.418 21:28:16 -- common/autotest_common.sh@874 -- # size=4096 00:05:37.418 21:28:16 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:37.418 21:28:16 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:37.418 21:28:16 -- common/autotest_common.sh@877 -- # return 0 00:05:37.418 21:28:16 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:37.418 21:28:16 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:37.418 21:28:16 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:37.418 21:28:16 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:37.418 21:28:16 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:37.677 { 00:05:37.677 "nbd_device": "/dev/nbd0", 00:05:37.677 "bdev_name": "Malloc0" 00:05:37.677 }, 00:05:37.677 { 00:05:37.677 "nbd_device": "/dev/nbd1", 00:05:37.677 "bdev_name": "Malloc1" 00:05:37.677 } 00:05:37.677 ]' 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:37.677 { 00:05:37.677 "nbd_device": "/dev/nbd0", 00:05:37.677 "bdev_name": "Malloc0" 00:05:37.677 }, 00:05:37.677 { 00:05:37.677 "nbd_device": "/dev/nbd1", 00:05:37.677 "bdev_name": "Malloc1" 00:05:37.677 } 00:05:37.677 ]' 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:37.677 /dev/nbd1' 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:37.677 /dev/nbd1' 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@65 -- # count=2 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:37.677 21:28:16 -- 
bdev/nbd_common.sh@95 -- # count=2 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:37.677 256+0 records in 00:05:37.677 256+0 records out 00:05:37.677 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114112 s, 91.9 MB/s 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:37.677 256+0 records in 00:05:37.677 256+0 records out 00:05:37.677 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0203452 s, 51.5 MB/s 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:37.677 21:28:16 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:37.936 256+0 records in 00:05:37.936 256+0 records out 00:05:37.936 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0216656 s, 48.4 MB/s 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:37.936 21:28:16 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@51 -- # local i 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@41 -- # break 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@45 -- # return 0 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:37.937 21:28:16 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:38.223 21:28:16 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:38.223 21:28:16 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:38.223 21:28:16 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:38.223 21:28:16 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:38.223 21:28:16 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:38.223 21:28:16 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:38.223 21:28:16 -- bdev/nbd_common.sh@41 -- # break 00:05:38.223 21:28:16 -- bdev/nbd_common.sh@45 -- # return 0 00:05:38.223 21:28:16 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:38.223 21:28:16 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:38.223 21:28:16 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@65 -- # true 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@65 -- # count=0 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@104 -- # count=0 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:38.506 21:28:17 -- bdev/nbd_common.sh@109 -- # return 0 00:05:38.506 21:28:17 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:38.506 21:28:17 -- event/event.sh@35 -- # sleep 3 00:05:38.765 [2024-07-12 21:28:17.454593] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:38.765 [2024-07-12 21:28:17.518255] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:38.765 [2024-07-12 21:28:17.518256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:39.025 [2024-07-12 21:28:17.559015] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:39.025 [2024-07-12 21:28:17.559053] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
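The block above is nbd_dd_data_verify at work: the helper fills a scratch file with 1 MiB from /dev/urandom, dd's it onto each exported /dev/nbdX with O_DIRECT, then replays the same file through cmp to prove the bytes survived the round trip before tearing the disks down and confirming nbd_get_disks reports zero devices. A minimal standalone sketch of the same write-then-verify pattern (the device names and temp-file handling here are illustrative assumptions, not taken from the trace):

    # sketch: write random data to nbd devices, then verify byte-for-byte
    tmp=$(mktemp)                                    # hypothetical scratch file
    dd if=/dev/urandom of="$tmp" bs=4096 count=256   # 1 MiB of random data
    for dev in /dev/nbd0 /dev/nbd1; do
        dd if="$tmp" of="$dev" bs=4096 count=256 oflag=direct   # bypass page cache
    done
    for dev in /dev/nbd0 /dev/nbd1; do
        cmp -b -n 1M "$tmp" "$dev" || echo "verify failed on $dev"
    done
    rm -f "$tmp"

The oflag=direct on the writes is what makes the cmp meaningful: it pushes the data past the page cache so the comparison exercises what the nbd server actually stored.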
00:05:41.558 21:28:20 -- event/event.sh@38 -- # waitforlisten 3558141 /var/tmp/spdk-nbd.sock 00:05:41.558 21:28:20 -- common/autotest_common.sh@819 -- # '[' -z 3558141 ']' 00:05:41.558 21:28:20 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:41.558 21:28:20 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:41.558 21:28:20 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:41.558 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:41.558 21:28:20 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:41.558 21:28:20 -- common/autotest_common.sh@10 -- # set +x 00:05:41.816 21:28:20 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:41.816 21:28:20 -- common/autotest_common.sh@852 -- # return 0 00:05:41.816 21:28:20 -- event/event.sh@39 -- # killprocess 3558141 00:05:41.816 21:28:20 -- common/autotest_common.sh@926 -- # '[' -z 3558141 ']' 00:05:41.816 21:28:20 -- common/autotest_common.sh@930 -- # kill -0 3558141 00:05:41.816 21:28:20 -- common/autotest_common.sh@931 -- # uname 00:05:41.816 21:28:20 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:41.816 21:28:20 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3558141 00:05:41.816 21:28:20 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:41.816 21:28:20 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:41.816 21:28:20 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3558141' 00:05:41.817 killing process with pid 3558141 00:05:41.817 21:28:20 -- common/autotest_common.sh@945 -- # kill 3558141 00:05:41.817 21:28:20 -- common/autotest_common.sh@950 -- # wait 3558141 00:05:42.075 spdk_app_start is called in Round 0. 00:05:42.075 Shutdown signal received, stop current app iteration 00:05:42.075 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:05:42.075 spdk_app_start is called in Round 1. 00:05:42.075 Shutdown signal received, stop current app iteration 00:05:42.075 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:05:42.075 spdk_app_start is called in Round 2. 00:05:42.075 Shutdown signal received, stop current app iteration 00:05:42.076 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:05:42.076 spdk_app_start is called in Round 3. 
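The killprocess helper, traced in full just above, is careful about what it kills: it probes the pid with kill -0, resolves the command name with ps so it never signals a sudo wrapper by mistake, and then waits so the pid is fully reaped before the next assertion runs. An approximation reconstructed from the trace (not the verbatim autotest_common.sh source):

    # sketch of the killprocess pattern visible in the xtrace above
    killprocess() {
        local pid=$1
        kill -0 "$pid" 2>/dev/null || return 1            # is the process alive?
        if [ "$(uname)" = Linux ]; then
            local name
            name=$(ps --no-headers -o comm= "$pid")       # e.g. reactor_0
            [ "$name" = sudo ] && return 1                # never kill a sudo wrapper
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"    # only works because the target is a child of this shell
    }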
00:05:42.076 Shutdown signal received, stop current app iteration 00:05:42.076 21:28:20 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:05:42.076 21:28:20 -- event/event.sh@42 -- # return 0 00:05:42.076 00:05:42.076 real 0m16.122s 00:05:42.076 user 0m34.205s 00:05:42.076 sys 0m3.000s 00:05:42.076 21:28:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:42.076 21:28:20 -- common/autotest_common.sh@10 -- # set +x 00:05:42.076 ************************************ 00:05:42.076 END TEST app_repeat 00:05:42.076 ************************************ 00:05:42.076 21:28:20 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:05:42.076 21:28:20 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:42.076 21:28:20 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:42.076 21:28:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:42.076 21:28:20 -- common/autotest_common.sh@10 -- # set +x 00:05:42.076 ************************************ 00:05:42.076 START TEST cpu_locks 00:05:42.076 ************************************ 00:05:42.076 21:28:20 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:42.076 * Looking for test storage... 00:05:42.076 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:42.076 21:28:20 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:05:42.076 21:28:20 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:05:42.076 21:28:20 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:05:42.076 21:28:20 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:05:42.076 21:28:20 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:42.076 21:28:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:42.076 21:28:20 -- common/autotest_common.sh@10 -- # set +x 00:05:42.076 ************************************ 00:05:42.076 START TEST default_locks 00:05:42.076 ************************************ 00:05:42.076 21:28:20 -- common/autotest_common.sh@1104 -- # default_locks 00:05:42.076 21:28:20 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=3561337 00:05:42.076 21:28:20 -- event/cpu_locks.sh@47 -- # waitforlisten 3561337 00:05:42.076 21:28:20 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:42.076 21:28:20 -- common/autotest_common.sh@819 -- # '[' -z 3561337 ']' 00:05:42.076 21:28:20 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:42.076 21:28:20 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:42.076 21:28:20 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:42.076 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:42.076 21:28:20 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:42.076 21:28:20 -- common/autotest_common.sh@10 -- # set +x 00:05:42.076 [2024-07-12 21:28:20.836292] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
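The START TEST / END TEST banners and the real/user/sys lines that bracket every test in this log come from the harness's run_test wrapper. A guess at its shape, inferred only from the output (the real helper in autotest_common.sh also validates its arguments and manages xtrace):

    # sketch: what produces the banners and timing lines seen throughout
    run_test() {
        local name=$1; shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"          # accounts for the real/user/sys lines per test
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
    }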
00:05:42.076 [2024-07-12 21:28:20.836389] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3561337 ] 00:05:42.334 EAL: No free 2048 kB hugepages reported on node 1 00:05:42.335 [2024-07-12 21:28:20.904588] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:42.335 [2024-07-12 21:28:20.974566] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:42.335 [2024-07-12 21:28:20.974680] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.901 21:28:21 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:42.901 21:28:21 -- common/autotest_common.sh@852 -- # return 0 00:05:42.901 21:28:21 -- event/cpu_locks.sh@49 -- # locks_exist 3561337 00:05:42.901 21:28:21 -- event/cpu_locks.sh@22 -- # lslocks -p 3561337 00:05:42.901 21:28:21 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:43.468 lslocks: write error 00:05:43.468 21:28:22 -- event/cpu_locks.sh@50 -- # killprocess 3561337 00:05:43.468 21:28:22 -- common/autotest_common.sh@926 -- # '[' -z 3561337 ']' 00:05:43.468 21:28:22 -- common/autotest_common.sh@930 -- # kill -0 3561337 00:05:43.468 21:28:22 -- common/autotest_common.sh@931 -- # uname 00:05:43.468 21:28:22 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:43.468 21:28:22 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3561337 00:05:43.468 21:28:22 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:43.468 21:28:22 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:43.468 21:28:22 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3561337' 00:05:43.468 killing process with pid 3561337 00:05:43.468 21:28:22 -- common/autotest_common.sh@945 -- # kill 3561337 00:05:43.468 21:28:22 -- common/autotest_common.sh@950 -- # wait 3561337 00:05:43.727 21:28:22 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 3561337 00:05:43.727 21:28:22 -- common/autotest_common.sh@640 -- # local es=0 00:05:43.727 21:28:22 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 3561337 00:05:43.727 21:28:22 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:05:43.727 21:28:22 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:43.727 21:28:22 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:05:43.727 21:28:22 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:43.727 21:28:22 -- common/autotest_common.sh@643 -- # waitforlisten 3561337 00:05:43.727 21:28:22 -- common/autotest_common.sh@819 -- # '[' -z 3561337 ']' 00:05:43.727 21:28:22 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:43.727 21:28:22 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:43.727 21:28:22 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:43.727 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
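The locks_exist check that just ran verifies the target pid is actually holding the per-core lock files by piping lslocks through grep. The stray "lslocks: write error" lines in this log are almost certainly harmless: grep -q exits on its first match and closes the pipe, so lslocks fails its remaining writes; the test result is unaffected. A sketch of the check:

    # sketch: confirm a pid holds an spdk per-core lock, as locks_exist does
    locks_exist() {
        local pid=$1
        lslocks -p "$pid" | grep -q spdk_cpu_lock   # early exit from grep -q is
                                                    # what provokes the cosmetic
                                                    # "lslocks: write error"
    }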
00:05:43.727 21:28:22 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:43.727 21:28:22 -- common/autotest_common.sh@10 -- # set +x 00:05:43.727 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (3561337) - No such process 00:05:43.727 ERROR: process (pid: 3561337) is no longer running 00:05:43.727 21:28:22 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:43.727 21:28:22 -- common/autotest_common.sh@852 -- # return 1 00:05:43.727 21:28:22 -- common/autotest_common.sh@643 -- # es=1 00:05:43.727 21:28:22 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:43.727 21:28:22 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:43.727 21:28:22 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:43.727 21:28:22 -- event/cpu_locks.sh@54 -- # no_locks 00:05:43.727 21:28:22 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:43.727 21:28:22 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:43.727 21:28:22 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:43.727 00:05:43.727 real 0m1.640s 00:05:43.727 user 0m1.702s 00:05:43.727 sys 0m0.589s 00:05:43.727 21:28:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:43.727 21:28:22 -- common/autotest_common.sh@10 -- # set +x 00:05:43.727 ************************************ 00:05:43.727 END TEST default_locks 00:05:43.727 ************************************ 00:05:43.727 21:28:22 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:05:43.727 21:28:22 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:43.727 21:28:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:43.727 21:28:22 -- common/autotest_common.sh@10 -- # set +x 00:05:43.727 ************************************ 00:05:43.727 START TEST default_locks_via_rpc 00:05:43.727 ************************************ 00:05:43.727 21:28:22 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:05:43.727 21:28:22 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=3561634 00:05:43.727 21:28:22 -- event/cpu_locks.sh@63 -- # waitforlisten 3561634 00:05:43.727 21:28:22 -- common/autotest_common.sh@819 -- # '[' -z 3561634 ']' 00:05:43.727 21:28:22 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:43.727 21:28:22 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:43.727 21:28:22 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:43.727 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:43.727 21:28:22 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:43.727 21:28:22 -- common/autotest_common.sh@10 -- # set +x 00:05:43.727 21:28:22 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:43.987 [2024-07-12 21:28:22.517960] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
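The failure just above is deliberate: the NOT wrapper inverts the wrapped command's exit status, so waitforlisten failing against the already-killed pid is exactly what makes the assertion pass (unless the command died to a signal, i.e. status above 128, which still counts as a real error). Simplified semantics, reconstructed from the traced arithmetic:

    # sketch: succeed only if the wrapped command fails cleanly
    NOT() {
        local es=0
        "$@" || es=$?
        (( es > 128 )) && return "$es"   # signal deaths propagate as failures
        (( es != 0 ))                    # invert: non-zero exit becomes success
    }
    NOT false && echo "NOT works"        # hypothetical usage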
00:05:43.987 [2024-07-12 21:28:22.518024] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3561634 ] 00:05:43.987 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.987 [2024-07-12 21:28:22.585490] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.987 [2024-07-12 21:28:22.660476] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:43.987 [2024-07-12 21:28:22.660596] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.571 21:28:23 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:44.572 21:28:23 -- common/autotest_common.sh@852 -- # return 0 00:05:44.572 21:28:23 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:05:44.572 21:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:44.572 21:28:23 -- common/autotest_common.sh@10 -- # set +x 00:05:44.572 21:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:44.572 21:28:23 -- event/cpu_locks.sh@67 -- # no_locks 00:05:44.572 21:28:23 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:44.572 21:28:23 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:44.572 21:28:23 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:44.572 21:28:23 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:05:44.572 21:28:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:44.572 21:28:23 -- common/autotest_common.sh@10 -- # set +x 00:05:44.572 21:28:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:44.572 21:28:23 -- event/cpu_locks.sh@71 -- # locks_exist 3561634 00:05:44.572 21:28:23 -- event/cpu_locks.sh@22 -- # lslocks -p 3561634 00:05:44.572 21:28:23 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:44.830 21:28:23 -- event/cpu_locks.sh@73 -- # killprocess 3561634 00:05:44.830 21:28:23 -- common/autotest_common.sh@926 -- # '[' -z 3561634 ']' 00:05:44.830 21:28:23 -- common/autotest_common.sh@930 -- # kill -0 3561634 00:05:44.830 21:28:23 -- common/autotest_common.sh@931 -- # uname 00:05:44.830 21:28:23 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:44.830 21:28:23 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3561634 00:05:44.830 21:28:23 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:44.830 21:28:23 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:44.830 21:28:23 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3561634' 00:05:44.830 killing process with pid 3561634 00:05:44.830 21:28:23 -- common/autotest_common.sh@945 -- # kill 3561634 00:05:44.830 21:28:23 -- common/autotest_common.sh@950 -- # wait 3561634 00:05:45.398 00:05:45.398 real 0m1.412s 00:05:45.398 user 0m1.462s 00:05:45.398 sys 0m0.467s 00:05:45.398 21:28:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:45.398 21:28:23 -- common/autotest_common.sh@10 -- # set +x 00:05:45.398 ************************************ 00:05:45.398 END TEST default_locks_via_rpc 00:05:45.398 ************************************ 00:05:45.398 21:28:23 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:05:45.398 21:28:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:45.398 21:28:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:45.398 21:28:23 -- 
common/autotest_common.sh@10 -- # set +x 00:05:45.398 ************************************ 00:05:45.398 START TEST non_locking_app_on_locked_coremask 00:05:45.398 ************************************ 00:05:45.398 21:28:23 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:05:45.398 21:28:23 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=3561935 00:05:45.398 21:28:23 -- event/cpu_locks.sh@81 -- # waitforlisten 3561935 /var/tmp/spdk.sock 00:05:45.398 21:28:23 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:45.398 21:28:23 -- common/autotest_common.sh@819 -- # '[' -z 3561935 ']' 00:05:45.398 21:28:23 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:45.398 21:28:23 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:45.399 21:28:23 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:45.399 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:45.399 21:28:23 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:45.399 21:28:23 -- common/autotest_common.sh@10 -- # set +x 00:05:45.399 [2024-07-12 21:28:23.977047] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:45.399 [2024-07-12 21:28:23.977123] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3561935 ] 00:05:45.399 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.399 [2024-07-12 21:28:24.044330] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.399 [2024-07-12 21:28:24.109337] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:45.399 [2024-07-12 21:28:24.109473] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.335 21:28:24 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:46.335 21:28:24 -- common/autotest_common.sh@852 -- # return 0 00:05:46.335 21:28:24 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=3561949 00:05:46.335 21:28:24 -- event/cpu_locks.sh@85 -- # waitforlisten 3561949 /var/tmp/spdk2.sock 00:05:46.335 21:28:24 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:05:46.335 21:28:24 -- common/autotest_common.sh@819 -- # '[' -z 3561949 ']' 00:05:46.335 21:28:24 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:46.335 21:28:24 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:46.335 21:28:24 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:46.335 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:46.335 21:28:24 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:46.335 21:28:24 -- common/autotest_common.sh@10 -- # set +x 00:05:46.335 [2024-07-12 21:28:24.809839] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
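The default_locks_via_rpc run that finished a few lines back exercises the same core-lock machinery over the RPC socket instead of at process launch: framework_disable_cpumask_locks drops the lock files and framework_enable_cpumask_locks re-takes them, with locks_exist asserting the files reappear. Issued by hand it would look like this (path relative to an SPDK checkout and socket path assumed; both method names appear verbatim in the trace):

    # sketch: toggle CPU core locks at runtime over the RPC socket
    ./scripts/rpc.py -s /var/tmp/spdk.sock framework_disable_cpumask_locks
    ./scripts/rpc.py -s /var/tmp/spdk.sock framework_enable_cpumask_locks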
00:05:46.336 [2024-07-12 21:28:24.809914] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3561949 ] 00:05:46.336 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.336 [2024-07-12 21:28:24.907426] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:46.336 [2024-07-12 21:28:24.907475] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.336 [2024-07-12 21:28:25.050495] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:46.336 [2024-07-12 21:28:25.050611] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.903 21:28:25 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:46.903 21:28:25 -- common/autotest_common.sh@852 -- # return 0 00:05:46.903 21:28:25 -- event/cpu_locks.sh@87 -- # locks_exist 3561935 00:05:46.903 21:28:25 -- event/cpu_locks.sh@22 -- # lslocks -p 3561935 00:05:46.903 21:28:25 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:48.279 lslocks: write error 00:05:48.279 21:28:26 -- event/cpu_locks.sh@89 -- # killprocess 3561935 00:05:48.279 21:28:26 -- common/autotest_common.sh@926 -- # '[' -z 3561935 ']' 00:05:48.279 21:28:26 -- common/autotest_common.sh@930 -- # kill -0 3561935 00:05:48.279 21:28:26 -- common/autotest_common.sh@931 -- # uname 00:05:48.279 21:28:26 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:48.279 21:28:26 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3561935 00:05:48.279 21:28:26 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:48.279 21:28:26 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:48.279 21:28:26 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3561935' 00:05:48.279 killing process with pid 3561935 00:05:48.279 21:28:26 -- common/autotest_common.sh@945 -- # kill 3561935 00:05:48.279 21:28:26 -- common/autotest_common.sh@950 -- # wait 3561935 00:05:48.845 21:28:27 -- event/cpu_locks.sh@90 -- # killprocess 3561949 00:05:48.845 21:28:27 -- common/autotest_common.sh@926 -- # '[' -z 3561949 ']' 00:05:48.845 21:28:27 -- common/autotest_common.sh@930 -- # kill -0 3561949 00:05:48.845 21:28:27 -- common/autotest_common.sh@931 -- # uname 00:05:48.845 21:28:27 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:48.845 21:28:27 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3561949 00:05:48.845 21:28:27 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:48.845 21:28:27 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:48.845 21:28:27 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3561949' 00:05:48.845 killing process with pid 3561949 00:05:48.845 21:28:27 -- common/autotest_common.sh@945 -- # kill 3561949 00:05:48.845 21:28:27 -- common/autotest_common.sh@950 -- # wait 3561949 00:05:49.103 00:05:49.103 real 0m3.923s 00:05:49.103 user 0m4.170s 00:05:49.103 sys 0m1.285s 00:05:49.103 21:28:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.103 21:28:27 -- common/autotest_common.sh@10 -- # set +x 00:05:49.103 ************************************ 00:05:49.103 END TEST non_locking_app_on_locked_coremask 00:05:49.103 ************************************ 00:05:49.361 21:28:27 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask 
locking_app_on_unlocked_coremask 00:05:49.361 21:28:27 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:49.361 21:28:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:49.361 21:28:27 -- common/autotest_common.sh@10 -- # set +x 00:05:49.361 ************************************ 00:05:49.361 START TEST locking_app_on_unlocked_coremask 00:05:49.361 ************************************ 00:05:49.361 21:28:27 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:05:49.361 21:28:27 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=3562547 00:05:49.361 21:28:27 -- event/cpu_locks.sh@99 -- # waitforlisten 3562547 /var/tmp/spdk.sock 00:05:49.361 21:28:27 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:05:49.361 21:28:27 -- common/autotest_common.sh@819 -- # '[' -z 3562547 ']' 00:05:49.361 21:28:27 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:49.361 21:28:27 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:49.361 21:28:27 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:49.361 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:49.361 21:28:27 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:49.361 21:28:27 -- common/autotest_common.sh@10 -- # set +x 00:05:49.361 [2024-07-12 21:28:27.951839] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:49.361 [2024-07-12 21:28:27.951931] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3562547 ] 00:05:49.361 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.361 [2024-07-12 21:28:28.021914] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:49.361 [2024-07-12 21:28:28.021941] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.361 [2024-07-12 21:28:28.092778] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:49.361 [2024-07-12 21:28:28.092893] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.296 21:28:28 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:50.296 21:28:28 -- common/autotest_common.sh@852 -- # return 0 00:05:50.296 21:28:28 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:50.296 21:28:28 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=3562786 00:05:50.296 21:28:28 -- event/cpu_locks.sh@103 -- # waitforlisten 3562786 /var/tmp/spdk2.sock 00:05:50.296 21:28:28 -- common/autotest_common.sh@819 -- # '[' -z 3562786 ']' 00:05:50.296 21:28:28 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:50.296 21:28:28 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:50.296 21:28:28 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:50.296 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
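Both targets for locking_app_on_unlocked_coremask are now starting: the first opted out of core locking with --disable-cpumask-locks, which is the only reason the second can claim core 0 at all. The launch pattern, roughly (binary path relative to an SPDK checkout; hugepages assumed already configured):

    # sketch: two targets sharing core 0 because the first holds no lock
    ./build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks &    # no lock taken
    ./build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock &     # claims core 0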
00:05:50.296 21:28:28 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:50.296 21:28:28 -- common/autotest_common.sh@10 -- # set +x 00:05:50.296 [2024-07-12 21:28:28.780846] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:50.296 [2024-07-12 21:28:28.780940] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3562786 ] 00:05:50.296 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.296 [2024-07-12 21:28:28.871202] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.296 [2024-07-12 21:28:29.012970] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:50.296 [2024-07-12 21:28:29.013099] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.863 21:28:29 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:50.863 21:28:29 -- common/autotest_common.sh@852 -- # return 0 00:05:50.863 21:28:29 -- event/cpu_locks.sh@105 -- # locks_exist 3562786 00:05:50.863 21:28:29 -- event/cpu_locks.sh@22 -- # lslocks -p 3562786 00:05:50.863 21:28:29 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:51.799 lslocks: write error 00:05:51.799 21:28:30 -- event/cpu_locks.sh@107 -- # killprocess 3562547 00:05:51.799 21:28:30 -- common/autotest_common.sh@926 -- # '[' -z 3562547 ']' 00:05:51.799 21:28:30 -- common/autotest_common.sh@930 -- # kill -0 3562547 00:05:51.799 21:28:30 -- common/autotest_common.sh@931 -- # uname 00:05:51.799 21:28:30 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:52.059 21:28:30 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3562547 00:05:52.059 21:28:30 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:52.059 21:28:30 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:52.059 21:28:30 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3562547' 00:05:52.059 killing process with pid 3562547 00:05:52.059 21:28:30 -- common/autotest_common.sh@945 -- # kill 3562547 00:05:52.059 21:28:30 -- common/autotest_common.sh@950 -- # wait 3562547 00:05:52.628 21:28:31 -- event/cpu_locks.sh@108 -- # killprocess 3562786 00:05:52.628 21:28:31 -- common/autotest_common.sh@926 -- # '[' -z 3562786 ']' 00:05:52.628 21:28:31 -- common/autotest_common.sh@930 -- # kill -0 3562786 00:05:52.628 21:28:31 -- common/autotest_common.sh@931 -- # uname 00:05:52.628 21:28:31 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:52.628 21:28:31 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3562786 00:05:52.628 21:28:31 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:52.628 21:28:31 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:52.628 21:28:31 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3562786' 00:05:52.628 killing process with pid 3562786 00:05:52.628 21:28:31 -- common/autotest_common.sh@945 -- # kill 3562786 00:05:52.628 21:28:31 -- common/autotest_common.sh@950 -- # wait 3562786 00:05:52.887 00:05:52.887 real 0m3.656s 00:05:52.887 user 0m3.875s 00:05:52.887 sys 0m1.160s 00:05:52.887 21:28:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:52.887 21:28:31 -- common/autotest_common.sh@10 -- # set +x 00:05:52.887 ************************************ 00:05:52.887 END TEST locking_app_on_unlocked_coremask 
00:05:52.887 ************************************ 00:05:52.887 21:28:31 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:05:52.887 21:28:31 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:52.887 21:28:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:52.887 21:28:31 -- common/autotest_common.sh@10 -- # set +x 00:05:52.887 ************************************ 00:05:52.887 START TEST locking_app_on_locked_coremask 00:05:52.887 ************************************ 00:05:52.887 21:28:31 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:05:52.887 21:28:31 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=3563362 00:05:52.887 21:28:31 -- event/cpu_locks.sh@116 -- # waitforlisten 3563362 /var/tmp/spdk.sock 00:05:52.887 21:28:31 -- common/autotest_common.sh@819 -- # '[' -z 3563362 ']' 00:05:52.887 21:28:31 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:52.887 21:28:31 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:52.887 21:28:31 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:52.887 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:52.887 21:28:31 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:52.887 21:28:31 -- common/autotest_common.sh@10 -- # set +x 00:05:52.887 21:28:31 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:52.887 [2024-07-12 21:28:31.647035] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:52.887 [2024-07-12 21:28:31.647107] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3563362 ] 00:05:53.146 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.146 [2024-07-12 21:28:31.714643] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:53.146 [2024-07-12 21:28:31.790148] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:53.146 [2024-07-12 21:28:31.790255] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.715 21:28:32 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:53.715 21:28:32 -- common/autotest_common.sh@852 -- # return 0 00:05:53.715 21:28:32 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=3563381 00:05:53.715 21:28:32 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 3563381 /var/tmp/spdk2.sock 00:05:53.715 21:28:32 -- common/autotest_common.sh@640 -- # local es=0 00:05:53.715 21:28:32 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 3563381 /var/tmp/spdk2.sock 00:05:53.715 21:28:32 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:53.715 21:28:32 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:05:53.715 21:28:32 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:53.715 21:28:32 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:05:53.715 21:28:32 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:53.715 21:28:32 -- common/autotest_common.sh@643 -- # waitforlisten 3563381 /var/tmp/spdk2.sock 00:05:53.715 21:28:32 -- 
common/autotest_common.sh@819 -- # '[' -z 3563381 ']' 00:05:53.715 21:28:32 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:53.715 21:28:32 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:53.715 21:28:32 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:53.715 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:53.715 21:28:32 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:53.715 21:28:32 -- common/autotest_common.sh@10 -- # set +x 00:05:53.715 [2024-07-12 21:28:32.466115] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:53.715 [2024-07-12 21:28:32.466171] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3563381 ] 00:05:53.974 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.974 [2024-07-12 21:28:32.559091] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 3563362 has claimed it. 00:05:53.974 [2024-07-12 21:28:32.559127] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:54.553 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (3563381) - No such process 00:05:54.553 ERROR: process (pid: 3563381) is no longer running 00:05:54.553 21:28:33 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:54.553 21:28:33 -- common/autotest_common.sh@852 -- # return 1 00:05:54.553 21:28:33 -- common/autotest_common.sh@643 -- # es=1 00:05:54.553 21:28:33 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:54.553 21:28:33 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:54.553 21:28:33 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:54.553 21:28:33 -- event/cpu_locks.sh@122 -- # locks_exist 3563362 00:05:54.553 21:28:33 -- event/cpu_locks.sh@22 -- # lslocks -p 3563362 00:05:54.553 21:28:33 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:54.813 lslocks: write error 00:05:54.813 21:28:33 -- event/cpu_locks.sh@124 -- # killprocess 3563362 00:05:54.813 21:28:33 -- common/autotest_common.sh@926 -- # '[' -z 3563362 ']' 00:05:54.813 21:28:33 -- common/autotest_common.sh@930 -- # kill -0 3563362 00:05:54.813 21:28:33 -- common/autotest_common.sh@931 -- # uname 00:05:54.813 21:28:33 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:54.813 21:28:33 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3563362 00:05:54.813 21:28:33 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:54.813 21:28:33 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:54.813 21:28:33 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3563362' 00:05:54.813 killing process with pid 3563362 00:05:54.813 21:28:33 -- common/autotest_common.sh@945 -- # kill 3563362 00:05:54.813 21:28:33 -- common/autotest_common.sh@950 -- # wait 3563362 00:05:55.073 00:05:55.073 real 0m2.203s 00:05:55.073 user 0m2.375s 00:05:55.073 sys 0m0.617s 00:05:55.073 21:28:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:55.073 21:28:33 -- common/autotest_common.sh@10 -- # set +x 00:05:55.073 ************************************ 00:05:55.073 END TEST locking_app_on_locked_coremask 00:05:55.073 
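locking_app_on_locked_coremask is the inverse case: with locking left on, the second target's claim_cpu_cores fails ("Cannot create lock on core 0, probably process 3563362 has claimed it") and it exits before ever opening its RPC socket, which is why the waitforlisten above had to be wrapped in NOT. Reproducing the conflict by hand would look roughly like this (same path assumptions as the earlier sketches):

    # sketch: second target exits because core 0 is already locked
    ./build/bin/spdk_tgt -m 0x1 &                            # claims core 0
    sleep 1                                                  # let it take the lock
    ./build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock
    echo "second instance exited with $?"                    # expected non-zero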
************************************ 00:05:55.332 21:28:33 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:05:55.332 21:28:33 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:55.332 21:28:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:55.332 21:28:33 -- common/autotest_common.sh@10 -- # set +x 00:05:55.332 ************************************ 00:05:55.332 START TEST locking_overlapped_coremask 00:05:55.332 ************************************ 00:05:55.332 21:28:33 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:05:55.332 21:28:33 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=3563671 00:05:55.332 21:28:33 -- event/cpu_locks.sh@133 -- # waitforlisten 3563671 /var/tmp/spdk.sock 00:05:55.332 21:28:33 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:05:55.332 21:28:33 -- common/autotest_common.sh@819 -- # '[' -z 3563671 ']' 00:05:55.332 21:28:33 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:55.332 21:28:33 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:55.332 21:28:33 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:55.332 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:55.332 21:28:33 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:55.332 21:28:33 -- common/autotest_common.sh@10 -- # set +x 00:05:55.332 [2024-07-12 21:28:33.896569] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:55.332 [2024-07-12 21:28:33.896648] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3563671 ] 00:05:55.332 EAL: No free 2048 kB hugepages reported on node 1 00:05:55.332 [2024-07-12 21:28:33.965699] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:55.332 [2024-07-12 21:28:34.041477] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:55.332 [2024-07-12 21:28:34.045460] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:55.332 [2024-07-12 21:28:34.045479] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:55.332 [2024-07-12 21:28:34.045484] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.270 21:28:34 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:56.270 21:28:34 -- common/autotest_common.sh@852 -- # return 0 00:05:56.270 21:28:34 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=3563939 00:05:56.270 21:28:34 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 3563939 /var/tmp/spdk2.sock 00:05:56.270 21:28:34 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:05:56.270 21:28:34 -- common/autotest_common.sh@640 -- # local es=0 00:05:56.270 21:28:34 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 3563939 /var/tmp/spdk2.sock 00:05:56.270 21:28:34 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:05:56.270 21:28:34 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:56.270 21:28:34 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:05:56.270 21:28:34 -- common/autotest_common.sh@632 
-- # case "$(type -t "$arg")" in 00:05:56.270 21:28:34 -- common/autotest_common.sh@643 -- # waitforlisten 3563939 /var/tmp/spdk2.sock 00:05:56.270 21:28:34 -- common/autotest_common.sh@819 -- # '[' -z 3563939 ']' 00:05:56.270 21:28:34 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:56.270 21:28:34 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:56.270 21:28:34 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:56.270 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:56.270 21:28:34 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:56.270 21:28:34 -- common/autotest_common.sh@10 -- # set +x 00:05:56.270 [2024-07-12 21:28:34.722189] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:56.270 [2024-07-12 21:28:34.722274] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3563939 ] 00:05:56.270 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.270 [2024-07-12 21:28:34.816479] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3563671 has claimed it. 00:05:56.270 [2024-07-12 21:28:34.816519] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:56.839 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (3563939) - No such process 00:05:56.839 ERROR: process (pid: 3563939) is no longer running 00:05:56.839 21:28:35 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:56.839 21:28:35 -- common/autotest_common.sh@852 -- # return 1 00:05:56.839 21:28:35 -- common/autotest_common.sh@643 -- # es=1 00:05:56.839 21:28:35 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:56.839 21:28:35 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:56.839 21:28:35 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:56.839 21:28:35 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:05:56.839 21:28:35 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:56.839 21:28:35 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:56.839 21:28:35 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:56.839 21:28:35 -- event/cpu_locks.sh@141 -- # killprocess 3563671 00:05:56.839 21:28:35 -- common/autotest_common.sh@926 -- # '[' -z 3563671 ']' 00:05:56.839 21:28:35 -- common/autotest_common.sh@930 -- # kill -0 3563671 00:05:56.839 21:28:35 -- common/autotest_common.sh@931 -- # uname 00:05:56.839 21:28:35 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:56.839 21:28:35 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3563671 00:05:56.839 21:28:35 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:56.839 21:28:35 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:56.839 21:28:35 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3563671' 00:05:56.839 killing process with pid 3563671 00:05:56.839 21:28:35 -- 
common/autotest_common.sh@945 -- # kill 3563671 00:05:56.839 21:28:35 -- common/autotest_common.sh@950 -- # wait 3563671 00:05:57.099 00:05:57.099 real 0m1.851s 00:05:57.099 user 0m5.196s 00:05:57.099 sys 0m0.419s 00:05:57.099 21:28:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:57.099 21:28:35 -- common/autotest_common.sh@10 -- # set +x 00:05:57.099 ************************************ 00:05:57.099 END TEST locking_overlapped_coremask 00:05:57.099 ************************************ 00:05:57.099 21:28:35 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:05:57.099 21:28:35 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:57.099 21:28:35 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:57.099 21:28:35 -- common/autotest_common.sh@10 -- # set +x 00:05:57.099 ************************************ 00:05:57.099 START TEST locking_overlapped_coremask_via_rpc 00:05:57.099 ************************************ 00:05:57.099 21:28:35 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:05:57.099 21:28:35 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=3564085 00:05:57.099 21:28:35 -- event/cpu_locks.sh@149 -- # waitforlisten 3564085 /var/tmp/spdk.sock 00:05:57.099 21:28:35 -- common/autotest_common.sh@819 -- # '[' -z 3564085 ']' 00:05:57.099 21:28:35 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:57.099 21:28:35 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:57.099 21:28:35 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:57.099 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:57.099 21:28:35 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:57.099 21:28:35 -- common/autotest_common.sh@10 -- # set +x 00:05:57.099 21:28:35 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:05:57.099 [2024-07-12 21:28:35.794412] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:57.099 [2024-07-12 21:28:35.794511] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3564085 ] 00:05:57.099 EAL: No free 2048 kB hugepages reported on node 1 00:05:57.099 [2024-07-12 21:28:35.863616] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
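The overlapped-coremask tests pick their masks so that exactly one core is contested: 0x7 is binary 111 (cores 0 through 2) and 0x1c is binary 11100 (cores 2 through 4), so the second target trips over core 2 and nothing else, matching the "Cannot create lock on core 2" error above. A quick arithmetic check (assumes bc is installed; the bitwise AND needs only the shell):

    # sketch: why core 2 is the only contested core
    echo "obase=2; 7"  | bc                       # 111   -> cores 0,1,2 (0x7)
    echo "obase=2; 28" | bc                       # 11100 -> cores 2,3,4 (0x1c)
    printf 'overlap: 0x%x\n' $(( 0x7 & 0x1c ))    # 0x4 -> bit 2 -> core 2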
00:05:57.099 [2024-07-12 21:28:35.863642] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:57.359 [2024-07-12 21:28:35.941286] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:57.359 [2024-07-12 21:28:35.941436] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:57.359 [2024-07-12 21:28:35.941529] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:57.359 [2024-07-12 21:28:35.941533] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.927 21:28:36 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:57.927 21:28:36 -- common/autotest_common.sh@852 -- # return 0 00:05:57.927 21:28:36 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=3564255 00:05:57.927 21:28:36 -- event/cpu_locks.sh@153 -- # waitforlisten 3564255 /var/tmp/spdk2.sock 00:05:57.927 21:28:36 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:05:57.927 21:28:36 -- common/autotest_common.sh@819 -- # '[' -z 3564255 ']' 00:05:57.927 21:28:36 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:57.927 21:28:36 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:57.927 21:28:36 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:57.927 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:57.927 21:28:36 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:57.927 21:28:36 -- common/autotest_common.sh@10 -- # set +x 00:05:57.927 [2024-07-12 21:28:36.637211] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:57.927 [2024-07-12 21:28:36.637276] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3564255 ] 00:05:57.927 EAL: No free 2048 kB hugepages reported on node 1 00:05:58.186 [2024-07-12 21:28:36.729759] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:05:58.186 [2024-07-12 21:28:36.729785] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:58.186 [2024-07-12 21:28:36.880860] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:58.186 [2024-07-12 21:28:36.881013] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:58.186 [2024-07-12 21:28:36.881150] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:58.186 [2024-07-12 21:28:36.881152] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:05:58.755 21:28:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:58.755 21:28:37 -- common/autotest_common.sh@852 -- # return 0 00:05:58.755 21:28:37 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:05:58.755 21:28:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:58.755 21:28:37 -- common/autotest_common.sh@10 -- # set +x 00:05:58.755 21:28:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:58.755 21:28:37 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:58.755 21:28:37 -- common/autotest_common.sh@640 -- # local es=0 00:05:58.755 21:28:37 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:58.755 21:28:37 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:05:58.755 21:28:37 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:58.755 21:28:37 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:05:58.755 21:28:37 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:05:58.755 21:28:37 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:58.755 21:28:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:58.755 21:28:37 -- common/autotest_common.sh@10 -- # set +x 00:05:58.755 [2024-07-12 21:28:37.477506] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3564085 has claimed it. 00:05:58.755 request: 00:05:58.755 { 00:05:58.755 "method": "framework_enable_cpumask_locks", 00:05:58.755 "req_id": 1 00:05:58.755 } 00:05:58.755 Got JSON-RPC error response 00:05:58.755 response: 00:05:58.755 { 00:05:58.755 "code": -32603, 00:05:58.755 "message": "Failed to claim CPU core: 2" 00:05:58.755 } 00:05:58.755 21:28:37 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:05:58.755 21:28:37 -- common/autotest_common.sh@643 -- # es=1 00:05:58.755 21:28:37 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:05:58.755 21:28:37 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:05:58.755 21:28:37 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:05:58.755 21:28:37 -- event/cpu_locks.sh@158 -- # waitforlisten 3564085 /var/tmp/spdk.sock 00:05:58.755 21:28:37 -- common/autotest_common.sh@819 -- # '[' -z 3564085 ']' 00:05:58.755 21:28:37 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:58.755 21:28:37 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:58.755 21:28:37 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:58.755 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
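Note the different failure mode in the via_rpc variant: because both targets started with locks disabled, the second comes up fine, and the conflict only surfaces when framework_enable_cpumask_locks tries to take core 2 at runtime; the target stays alive and the caller just gets the -32603 JSON-RPC error shown above. Replayed by hand (socket path as in the trace, rpc.py path relative to an SPDK checkout):

    # sketch: the failing claim surfaces as a JSON-RPC error, not a crash
    ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks \
        || echo "claim refused with JSON-RPC error -32603, as expected"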
00:05:58.755 21:28:37 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:58.755 21:28:37 -- common/autotest_common.sh@10 -- # set +x 00:05:59.014 21:28:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:59.014 21:28:37 -- common/autotest_common.sh@852 -- # return 0 00:05:59.014 21:28:37 -- event/cpu_locks.sh@159 -- # waitforlisten 3564255 /var/tmp/spdk2.sock 00:05:59.014 21:28:37 -- common/autotest_common.sh@819 -- # '[' -z 3564255 ']' 00:05:59.014 21:28:37 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:59.014 21:28:37 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:59.014 21:28:37 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:59.014 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:59.014 21:28:37 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:59.014 21:28:37 -- common/autotest_common.sh@10 -- # set +x 00:05:59.274 21:28:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:59.274 21:28:37 -- common/autotest_common.sh@852 -- # return 0 00:05:59.274 21:28:37 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:05:59.274 21:28:37 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:59.274 21:28:37 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:59.274 21:28:37 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:59.274 00:05:59.274 real 0m2.085s 00:05:59.274 user 0m0.804s 00:05:59.274 sys 0m0.210s 00:05:59.274 21:28:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:59.274 21:28:37 -- common/autotest_common.sh@10 -- # set +x 00:05:59.274 ************************************ 00:05:59.274 END TEST locking_overlapped_coremask_via_rpc 00:05:59.274 ************************************ 00:05:59.274 21:28:37 -- event/cpu_locks.sh@174 -- # cleanup 00:05:59.274 21:28:37 -- event/cpu_locks.sh@15 -- # [[ -z 3564085 ]] 00:05:59.274 21:28:37 -- event/cpu_locks.sh@15 -- # killprocess 3564085 00:05:59.274 21:28:37 -- common/autotest_common.sh@926 -- # '[' -z 3564085 ']' 00:05:59.274 21:28:37 -- common/autotest_common.sh@930 -- # kill -0 3564085 00:05:59.274 21:28:37 -- common/autotest_common.sh@931 -- # uname 00:05:59.274 21:28:37 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:59.274 21:28:37 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3564085 00:05:59.274 21:28:37 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:59.274 21:28:37 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:59.274 21:28:37 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3564085' 00:05:59.274 killing process with pid 3564085 00:05:59.274 21:28:37 -- common/autotest_common.sh@945 -- # kill 3564085 00:05:59.274 21:28:37 -- common/autotest_common.sh@950 -- # wait 3564085 00:05:59.534 21:28:38 -- event/cpu_locks.sh@16 -- # [[ -z 3564255 ]] 00:05:59.534 21:28:38 -- event/cpu_locks.sh@16 -- # killprocess 3564255 00:05:59.534 21:28:38 -- common/autotest_common.sh@926 -- # '[' -z 3564255 ']' 00:05:59.534 21:28:38 -- common/autotest_common.sh@930 -- # kill -0 3564255 00:05:59.534 21:28:38 -- common/autotest_common.sh@931 -- # uname 
00:05:59.534 21:28:38 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:59.534 21:28:38 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3564255 00:05:59.793 21:28:38 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:05:59.793 21:28:38 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:05:59.793 21:28:38 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3564255' 00:05:59.793 killing process with pid 3564255 00:05:59.793 21:28:38 -- common/autotest_common.sh@945 -- # kill 3564255 00:05:59.793 21:28:38 -- common/autotest_common.sh@950 -- # wait 3564255 00:06:00.054 21:28:38 -- event/cpu_locks.sh@18 -- # rm -f 00:06:00.054 21:28:38 -- event/cpu_locks.sh@1 -- # cleanup 00:06:00.054 21:28:38 -- event/cpu_locks.sh@15 -- # [[ -z 3564085 ]] 00:06:00.054 21:28:38 -- event/cpu_locks.sh@15 -- # killprocess 3564085 00:06:00.054 21:28:38 -- common/autotest_common.sh@926 -- # '[' -z 3564085 ']' 00:06:00.054 21:28:38 -- common/autotest_common.sh@930 -- # kill -0 3564085 00:06:00.054 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (3564085) - No such process 00:06:00.054 21:28:38 -- common/autotest_common.sh@953 -- # echo 'Process with pid 3564085 is not found' 00:06:00.054 Process with pid 3564085 is not found 00:06:00.054 21:28:38 -- event/cpu_locks.sh@16 -- # [[ -z 3564255 ]] 00:06:00.054 21:28:38 -- event/cpu_locks.sh@16 -- # killprocess 3564255 00:06:00.054 21:28:38 -- common/autotest_common.sh@926 -- # '[' -z 3564255 ']' 00:06:00.054 21:28:38 -- common/autotest_common.sh@930 -- # kill -0 3564255 00:06:00.054 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (3564255) - No such process 00:06:00.054 21:28:38 -- common/autotest_common.sh@953 -- # echo 'Process with pid 3564255 is not found' 00:06:00.054 Process with pid 3564255 is not found 00:06:00.054 21:28:38 -- event/cpu_locks.sh@18 -- # rm -f 00:06:00.054 00:06:00.054 real 0m17.938s 00:06:00.054 user 0m30.070s 00:06:00.054 sys 0m5.653s 00:06:00.054 21:28:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:00.054 21:28:38 -- common/autotest_common.sh@10 -- # set +x 00:06:00.054 ************************************ 00:06:00.054 END TEST cpu_locks 00:06:00.054 ************************************ 00:06:00.054 00:06:00.054 real 0m43.146s 00:06:00.054 user 1m21.254s 00:06:00.054 sys 0m9.623s 00:06:00.054 21:28:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:00.054 21:28:38 -- common/autotest_common.sh@10 -- # set +x 00:06:00.054 ************************************ 00:06:00.054 END TEST event 00:06:00.054 ************************************ 00:06:00.054 21:28:38 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:00.054 21:28:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:00.054 21:28:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:00.054 21:28:38 -- common/autotest_common.sh@10 -- # set +x 00:06:00.054 ************************************ 00:06:00.054 START TEST thread 00:06:00.054 ************************************ 00:06:00.054 21:28:38 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:00.054 * Looking for test storage... 
00:06:00.054 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:00.054 21:28:38 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:00.054 21:28:38 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:00.054 21:28:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:00.054 21:28:38 -- common/autotest_common.sh@10 -- # set +x 00:06:00.314 ************************************ 00:06:00.314 START TEST thread_poller_perf 00:06:00.314 ************************************ 00:06:00.314 21:28:38 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:00.314 [2024-07-12 21:28:38.854912] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:00.315 [2024-07-12 21:28:38.855010] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3564719 ] 00:06:00.315 EAL: No free 2048 kB hugepages reported on node 1 00:06:00.315 [2024-07-12 21:28:38.927569] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:00.315 [2024-07-12 21:28:38.996895] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:00.315 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:06:01.693 ====================================== 00:06:01.693 busy:2505415762 (cyc) 00:06:01.693 total_run_count: 812000 00:06:01.693 tsc_hz: 2500000000 (cyc) 00:06:01.693 ====================================== 00:06:01.693 poller_cost: 3085 (cyc), 1234 (nsec) 00:06:01.693 00:06:01.693 real 0m1.228s 00:06:01.693 user 0m1.123s 00:06:01.693 sys 0m0.100s 00:06:01.693 21:28:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:01.693 21:28:40 -- common/autotest_common.sh@10 -- # set +x 00:06:01.693 ************************************ 00:06:01.693 END TEST thread_poller_perf 00:06:01.693 ************************************ 00:06:01.693 21:28:40 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:01.693 21:28:40 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:01.693 21:28:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:01.693 21:28:40 -- common/autotest_common.sh@10 -- # set +x 00:06:01.693 ************************************ 00:06:01.693 START TEST thread_poller_perf 00:06:01.693 ************************************ 00:06:01.693 21:28:40 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:01.693 [2024-07-12 21:28:40.130608] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:01.693 [2024-07-12 21:28:40.130721] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3564908 ] 00:06:01.693 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.693 [2024-07-12 21:28:40.203565] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.693 [2024-07-12 21:28:40.271136] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.693 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:06:02.630 ====================================== 00:06:02.630 busy:2501995130 (cyc) 00:06:02.630 total_run_count: 14386000 00:06:02.630 tsc_hz: 2500000000 (cyc) 00:06:02.630 ====================================== 00:06:02.630 poller_cost: 173 (cyc), 69 (nsec) 00:06:02.630 00:06:02.630 real 0m1.221s 00:06:02.630 user 0m1.130s 00:06:02.630 sys 0m0.087s 00:06:02.630 21:28:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:02.630 21:28:41 -- common/autotest_common.sh@10 -- # set +x 00:06:02.630 ************************************ 00:06:02.630 END TEST thread_poller_perf 00:06:02.630 ************************************ 00:06:02.630 21:28:41 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:02.630 21:28:41 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:02.630 21:28:41 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:02.630 21:28:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:02.630 21:28:41 -- common/autotest_common.sh@10 -- # set +x 00:06:02.630 ************************************ 00:06:02.630 START TEST thread_spdk_lock 00:06:02.630 ************************************ 00:06:02.630 21:28:41 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:02.630 [2024-07-12 21:28:41.394823] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
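poller_cost in the two poller_perf summary blocks above is straight division: busy TSC cycles over total_run_count, then cycles to nanoseconds at the reported 2.5 GHz tsc_hz. Re-deriving the printed numbers with shell arithmetic (integer truncation happens to reproduce them exactly):

    echo $(( 2505415762 / 812000 ))    # 1 us period run -> 3085 cyc per poller
    echo $(( 3085 * 1000 / 2500 ))     #                 -> 1234 nsec
    echo $(( 2501995130 / 14386000 ))  # 0 us period run -> 173 cyc per poller
    echo $(( 173 * 1000 / 2500 ))      #                 -> 69 nsec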
00:06:02.630 [2024-07-12 21:28:41.394917] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3565201 ] 00:06:02.890 EAL: No free 2048 kB hugepages reported on node 1 00:06:02.890 [2024-07-12 21:28:41.464594] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:02.890 [2024-07-12 21:28:41.531105] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:02.890 [2024-07-12 21:28:41.531107] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.457 [2024-07-12 21:28:42.017516] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:03.457 [2024-07-12 21:28:42.017560] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:03.457 [2024-07-12 21:28:42.017571] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x149c080 00:06:03.457 [2024-07-12 21:28:42.018459] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:03.457 [2024-07-12 21:28:42.018564] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:03.457 [2024-07-12 21:28:42.018583] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:03.457 Starting test contend 00:06:03.457 Worker Delay Wait us Hold us Total us 00:06:03.457 0 3 167624 184815 352440 00:06:03.457 1 5 86732 284217 370950 00:06:03.457 PASS test contend 00:06:03.457 Starting test hold_by_poller 00:06:03.457 PASS test hold_by_poller 00:06:03.457 Starting test hold_by_message 00:06:03.457 PASS test hold_by_message 00:06:03.457 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:03.457 100014 assertions passed 00:06:03.457 0 assertions failed 00:06:03.457 00:06:03.457 real 0m0.701s 00:06:03.457 user 0m1.101s 00:06:03.457 sys 0m0.084s 00:06:03.457 21:28:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:03.457 21:28:42 -- common/autotest_common.sh@10 -- # set +x 00:06:03.457 ************************************ 00:06:03.457 END TEST thread_spdk_lock 00:06:03.457 ************************************ 00:06:03.457 00:06:03.457 real 0m3.386s 00:06:03.457 user 0m3.433s 00:06:03.457 sys 0m0.464s 00:06:03.457 21:28:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:03.457 21:28:42 -- common/autotest_common.sh@10 -- # set +x 00:06:03.457 ************************************ 00:06:03.457 END TEST thread 00:06:03.457 ************************************ 00:06:03.457 21:28:42 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:03.457 21:28:42 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 
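In the spdk_lock contend table above, the Total us column is simply Wait us plus Hold us per worker; the 1 us gap against the printed totals presumably comes from per-column rounding:

    echo $(( 167624 + 184815 ))   # worker 0 -> 352439 (printed: 352440)
    echo $(( 86732 + 284217 ))    # worker 1 -> 370949 (printed: 370950)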
00:06:03.457 21:28:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:03.457 21:28:42 -- common/autotest_common.sh@10 -- # set +x 00:06:03.457 ************************************ 00:06:03.457 START TEST accel 00:06:03.457 ************************************ 00:06:03.457 21:28:42 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:03.716 * Looking for test storage... 00:06:03.716 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:03.716 21:28:42 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:03.716 21:28:42 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:03.716 21:28:42 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:03.716 21:28:42 -- accel/accel.sh@59 -- # spdk_tgt_pid=3565519 00:06:03.716 21:28:42 -- accel/accel.sh@60 -- # waitforlisten 3565519 00:06:03.716 21:28:42 -- common/autotest_common.sh@819 -- # '[' -z 3565519 ']' 00:06:03.716 21:28:42 -- accel/accel.sh@58 -- # build_accel_config 00:06:03.716 21:28:42 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:03.716 21:28:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:03.716 21:28:42 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:03.716 21:28:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:03.716 21:28:42 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:03.716 21:28:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:03.716 21:28:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:03.716 21:28:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:03.716 21:28:42 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:03.716 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:03.716 21:28:42 -- accel/accel.sh@41 -- # local IFS=, 00:06:03.716 21:28:42 -- accel/accel.sh@42 -- # jq -r . 00:06:03.716 21:28:42 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:03.716 21:28:42 -- common/autotest_common.sh@10 -- # set +x 00:06:03.716 [2024-07-12 21:28:42.290232] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:03.716 [2024-07-12 21:28:42.290324] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3565519 ] 00:06:03.716 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.716 [2024-07-12 21:28:42.358643] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.716 [2024-07-12 21:28:42.426604] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:03.716 [2024-07-12 21:28:42.426716] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.659 21:28:43 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:04.659 21:28:43 -- common/autotest_common.sh@852 -- # return 0 00:06:04.660 21:28:43 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:04.660 21:28:43 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:04.660 21:28:43 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:04.660 21:28:43 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:04.660 21:28:43 -- common/autotest_common.sh@10 -- # set +x 00:06:04.660 21:28:43 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # 
IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # IFS== 00:06:04.660 21:28:43 -- accel/accel.sh@64 -- # read -r opc module 00:06:04.660 21:28:43 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:04.660 21:28:43 -- accel/accel.sh@67 -- # killprocess 3565519 00:06:04.660 21:28:43 -- common/autotest_common.sh@926 -- # '[' -z 3565519 ']' 00:06:04.660 21:28:43 -- common/autotest_common.sh@930 -- # kill -0 3565519 00:06:04.660 21:28:43 -- common/autotest_common.sh@931 -- # uname 00:06:04.660 21:28:43 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:04.660 21:28:43 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3565519 00:06:04.660 21:28:43 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:04.660 21:28:43 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:04.660 21:28:43 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3565519' 00:06:04.660 killing process with pid 3565519 00:06:04.660 21:28:43 -- common/autotest_common.sh@945 -- # kill 3565519 00:06:04.660 21:28:43 -- common/autotest_common.sh@950 -- # wait 3565519 00:06:04.919 21:28:43 -- accel/accel.sh@68 -- # trap - ERR 00:06:04.919 21:28:43 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:04.919 21:28:43 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:06:04.919 21:28:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:04.919 21:28:43 -- common/autotest_common.sh@10 -- # set +x 00:06:04.919 21:28:43 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:06:04.919 21:28:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:04.919 21:28:43 -- accel/accel.sh@12 -- # build_accel_config 00:06:04.919 21:28:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:04.919 21:28:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:04.919 21:28:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:04.919 21:28:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:04.919 21:28:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:04.919 21:28:43 -- accel/accel.sh@41 -- # local IFS=, 00:06:04.919 21:28:43 -- accel/accel.sh@42 -- # jq -r . 
00:06:04.919 21:28:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:04.919 21:28:43 -- common/autotest_common.sh@10 -- # set +x 00:06:04.919 21:28:43 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:04.919 21:28:43 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:04.919 21:28:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:04.919 21:28:43 -- common/autotest_common.sh@10 -- # set +x 00:06:04.919 ************************************ 00:06:04.919 START TEST accel_missing_filename 00:06:04.919 ************************************ 00:06:04.919 21:28:43 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:06:04.919 21:28:43 -- common/autotest_common.sh@640 -- # local es=0 00:06:04.919 21:28:43 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:04.919 21:28:43 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:04.919 21:28:43 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:04.919 21:28:43 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:04.919 21:28:43 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:04.919 21:28:43 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:06:04.919 21:28:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:04.919 21:28:43 -- accel/accel.sh@12 -- # build_accel_config 00:06:04.919 21:28:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:04.919 21:28:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:04.919 21:28:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:04.919 21:28:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:04.919 21:28:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:04.919 21:28:43 -- accel/accel.sh@41 -- # local IFS=, 00:06:04.919 21:28:43 -- accel/accel.sh@42 -- # jq -r . 00:06:04.919 [2024-07-12 21:28:43.564941] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:04.919 [2024-07-12 21:28:43.565036] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3565703 ] 00:06:04.919 EAL: No free 2048 kB hugepages reported on node 1 00:06:04.919 [2024-07-12 21:28:43.637289] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.178 [2024-07-12 21:28:43.708018] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.178 [2024-07-12 21:28:43.747811] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:05.178 [2024-07-12 21:28:43.808045] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:05.178 A filename is required. 
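The abort above is accel_perf refusing to start a compress workload without an input file. The compress_verify test that follows supplies one via -l but keeps -y, which compress also rejects; a sketch of an invocation that should pass argument validation, reusing this workspace's binary and input-file paths (whether the software compress path then runs to completion depends on how this build was configured):

    # compress for 1 second against the test input; -y is omitted because
    # "Compression does not support the verify option" (see the next test)
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib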
00:06:05.178 21:28:43 -- common/autotest_common.sh@643 -- # es=234 00:06:05.178 21:28:43 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:05.178 21:28:43 -- common/autotest_common.sh@652 -- # es=106 00:06:05.179 21:28:43 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:05.179 21:28:43 -- common/autotest_common.sh@660 -- # es=1 00:06:05.179 21:28:43 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:05.179 00:06:05.179 real 0m0.332s 00:06:05.179 user 0m0.242s 00:06:05.179 sys 0m0.129s 00:06:05.179 21:28:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:05.179 21:28:43 -- common/autotest_common.sh@10 -- # set +x 00:06:05.179 ************************************ 00:06:05.179 END TEST accel_missing_filename 00:06:05.179 ************************************ 00:06:05.179 21:28:43 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:05.179 21:28:43 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:05.179 21:28:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:05.179 21:28:43 -- common/autotest_common.sh@10 -- # set +x 00:06:05.179 ************************************ 00:06:05.179 START TEST accel_compress_verify 00:06:05.179 ************************************ 00:06:05.179 21:28:43 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:05.179 21:28:43 -- common/autotest_common.sh@640 -- # local es=0 00:06:05.179 21:28:43 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:05.179 21:28:43 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:05.179 21:28:43 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:05.179 21:28:43 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:05.179 21:28:43 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:05.179 21:28:43 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:05.179 21:28:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:05.179 21:28:43 -- accel/accel.sh@12 -- # build_accel_config 00:06:05.179 21:28:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:05.179 21:28:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:05.179 21:28:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:05.179 21:28:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:05.179 21:28:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:05.179 21:28:43 -- accel/accel.sh@41 -- # local IFS=, 00:06:05.179 21:28:43 -- accel/accel.sh@42 -- # jq -r . 00:06:05.179 [2024-07-12 21:28:43.939662] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:05.179 [2024-07-12 21:28:43.939750] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3565840 ] 00:06:05.438 EAL: No free 2048 kB hugepages reported on node 1 00:06:05.438 [2024-07-12 21:28:44.010565] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.438 [2024-07-12 21:28:44.078347] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.438 [2024-07-12 21:28:44.118275] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:05.438 [2024-07-12 21:28:44.178646] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:05.698 00:06:05.698 Compression does not support the verify option, aborting. 00:06:05.698 21:28:44 -- common/autotest_common.sh@643 -- # es=161 00:06:05.698 21:28:44 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:05.698 21:28:44 -- common/autotest_common.sh@652 -- # es=33 00:06:05.698 21:28:44 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:05.698 21:28:44 -- common/autotest_common.sh@660 -- # es=1 00:06:05.698 21:28:44 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:05.698 00:06:05.698 real 0m0.328s 00:06:05.698 user 0m0.246s 00:06:05.698 sys 0m0.121s 00:06:05.698 21:28:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:05.698 21:28:44 -- common/autotest_common.sh@10 -- # set +x 00:06:05.698 ************************************ 00:06:05.698 END TEST accel_compress_verify 00:06:05.698 ************************************ 00:06:05.698 21:28:44 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:05.698 21:28:44 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:05.698 21:28:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:05.698 21:28:44 -- common/autotest_common.sh@10 -- # set +x 00:06:05.698 ************************************ 00:06:05.698 START TEST accel_wrong_workload 00:06:05.698 ************************************ 00:06:05.698 21:28:44 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:06:05.698 21:28:44 -- common/autotest_common.sh@640 -- # local es=0 00:06:05.698 21:28:44 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:05.698 21:28:44 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:05.698 21:28:44 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:05.698 21:28:44 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:05.698 21:28:44 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:05.698 21:28:44 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:06:05.698 21:28:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:05.698 21:28:44 -- accel/accel.sh@12 -- # build_accel_config 00:06:05.698 21:28:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:05.698 21:28:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:05.698 21:28:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:05.698 21:28:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:05.699 21:28:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:05.699 21:28:44 -- accel/accel.sh@41 -- # local IFS=, 00:06:05.699 21:28:44 -- accel/accel.sh@42 -- # jq -r . 
00:06:05.699 Unsupported workload type: foobar 00:06:05.699 [2024-07-12 21:28:44.303979] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:05.699 accel_perf options: 00:06:05.699 [-h help message] 00:06:05.699 [-q queue depth per core] 00:06:05.699 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:05.699 [-T number of threads per core 00:06:05.699 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:05.699 [-t time in seconds] 00:06:05.699 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:05.699 [ dif_verify, , dif_generate, dif_generate_copy 00:06:05.699 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:05.699 [-l for compress/decompress workloads, name of uncompressed input file 00:06:05.699 [-S for crc32c workload, use this seed value (default 0) 00:06:05.699 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:05.699 [-f for fill workload, use this BYTE value (default 255) 00:06:05.699 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:05.699 [-y verify result if this switch is on] 00:06:05.699 [-a tasks to allocate per core (default: same value as -q)] 00:06:05.699 Can be used to spread operations across a wider range of memory. 00:06:05.699 21:28:44 -- common/autotest_common.sh@643 -- # es=1 00:06:05.699 21:28:44 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:05.699 21:28:44 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:05.699 21:28:44 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:05.699 00:06:05.699 real 0m0.023s 00:06:05.699 user 0m0.010s 00:06:05.699 sys 0m0.012s 00:06:05.699 21:28:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:05.699 21:28:44 -- common/autotest_common.sh@10 -- # set +x 00:06:05.699 ************************************ 00:06:05.699 END TEST accel_wrong_workload 00:06:05.699 ************************************ 00:06:05.699 Error: writing output failed: Broken pipe 00:06:05.699 21:28:44 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:05.699 21:28:44 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:05.699 21:28:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:05.699 21:28:44 -- common/autotest_common.sh@10 -- # set +x 00:06:05.699 ************************************ 00:06:05.699 START TEST accel_negative_buffers 00:06:05.699 ************************************ 00:06:05.699 21:28:44 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:05.699 21:28:44 -- common/autotest_common.sh@640 -- # local es=0 00:06:05.699 21:28:44 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:05.699 21:28:44 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:05.699 21:28:44 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:05.699 21:28:44 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:05.699 21:28:44 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:05.699 21:28:44 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:06:05.699 21:28:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:05.699 21:28:44 -- accel/accel.sh@12 -- # build_accel_config 00:06:05.699 21:28:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:05.699 21:28:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:05.699 21:28:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:05.699 21:28:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:05.699 21:28:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:05.699 21:28:44 -- accel/accel.sh@41 -- # local IFS=, 00:06:05.699 21:28:44 -- accel/accel.sh@42 -- # jq -r . 00:06:05.699 -x option must be non-negative. 00:06:05.699 [2024-07-12 21:28:44.379155] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:05.699 accel_perf options: 00:06:05.699 [-h help message] 00:06:05.699 [-q queue depth per core] 00:06:05.699 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:05.699 [-T number of threads per core 00:06:05.699 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:05.699 [-t time in seconds] 00:06:05.699 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:05.699 [ dif_verify, , dif_generate, dif_generate_copy 00:06:05.699 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:05.699 [-l for compress/decompress workloads, name of uncompressed input file 00:06:05.699 [-S for crc32c workload, use this seed value (default 0) 00:06:05.699 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:05.699 [-f for fill workload, use this BYTE value (default 255) 00:06:05.699 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:05.699 [-y verify result if this switch is on] 00:06:05.699 [-a tasks to allocate per core (default: same value as -q)] 00:06:05.699 Can be used to spread operations across a wider range of memory. 
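Both negative tests above only exercise argument validation (-w foobar, then -x -1) and end at this usage text. For contrast, the accel_crc32c test that follows drives the same binary with valid options; run standalone it would look like:

    # 1-second crc32c workload, seed 32, default 4 KiB transfers, verify on
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w crc32c -S 32 -y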
00:06:05.699 21:28:44 -- common/autotest_common.sh@643 -- # es=1 00:06:05.699 21:28:44 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:05.699 21:28:44 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:05.699 21:28:44 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:05.699 00:06:05.699 real 0m0.029s 00:06:05.699 user 0m0.012s 00:06:05.699 sys 0m0.016s 00:06:05.699 21:28:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:05.699 21:28:44 -- common/autotest_common.sh@10 -- # set +x 00:06:05.699 ************************************ 00:06:05.699 END TEST accel_negative_buffers 00:06:05.699 ************************************ 00:06:05.699 Error: writing output failed: Broken pipe 00:06:05.699 21:28:44 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:05.699 21:28:44 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:05.699 21:28:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:05.699 21:28:44 -- common/autotest_common.sh@10 -- # set +x 00:06:05.699 ************************************ 00:06:05.699 START TEST accel_crc32c 00:06:05.699 ************************************ 00:06:05.699 21:28:44 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:05.699 21:28:44 -- accel/accel.sh@16 -- # local accel_opc 00:06:05.699 21:28:44 -- accel/accel.sh@17 -- # local accel_module 00:06:05.699 21:28:44 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:05.699 21:28:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:05.699 21:28:44 -- accel/accel.sh@12 -- # build_accel_config 00:06:05.699 21:28:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:05.699 21:28:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:05.699 21:28:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:05.699 21:28:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:05.699 21:28:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:05.699 21:28:44 -- accel/accel.sh@41 -- # local IFS=, 00:06:05.699 21:28:44 -- accel/accel.sh@42 -- # jq -r . 00:06:05.699 [2024-07-12 21:28:44.449761] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:05.699 [2024-07-12 21:28:44.449847] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3565907 ] 00:06:05.959 EAL: No free 2048 kB hugepages reported on node 1 00:06:05.959 [2024-07-12 21:28:44.520017] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.959 [2024-07-12 21:28:44.592450] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.341 21:28:45 -- accel/accel.sh@18 -- # out=' 00:06:07.341 SPDK Configuration: 00:06:07.341 Core mask: 0x1 00:06:07.341 00:06:07.341 Accel Perf Configuration: 00:06:07.341 Workload Type: crc32c 00:06:07.341 CRC-32C seed: 32 00:06:07.341 Transfer size: 4096 bytes 00:06:07.341 Vector count 1 00:06:07.341 Module: software 00:06:07.341 Queue depth: 32 00:06:07.341 Allocate depth: 32 00:06:07.341 # threads/core: 1 00:06:07.341 Run time: 1 seconds 00:06:07.341 Verify: Yes 00:06:07.341 00:06:07.341 Running for 1 seconds... 
00:06:07.341 00:06:07.341 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:07.341 ------------------------------------------------------------------------------------ 00:06:07.341 0,0 842976/s 3292 MiB/s 0 0 00:06:07.341 ==================================================================================== 00:06:07.341 Total 842976/s 3292 MiB/s 0 0' 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:07.341 21:28:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:07.341 21:28:45 -- accel/accel.sh@12 -- # build_accel_config 00:06:07.341 21:28:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:07.341 21:28:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:07.341 21:28:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:07.341 21:28:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:07.341 21:28:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:07.341 21:28:45 -- accel/accel.sh@41 -- # local IFS=, 00:06:07.341 21:28:45 -- accel/accel.sh@42 -- # jq -r . 00:06:07.341 [2024-07-12 21:28:45.781406] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:07.341 [2024-07-12 21:28:45.781505] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3566173 ] 00:06:07.341 EAL: No free 2048 kB hugepages reported on node 1 00:06:07.341 [2024-07-12 21:28:45.851207] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.341 [2024-07-12 21:28:45.917869] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val= 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val= 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val=0x1 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val= 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val= 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val=crc32c 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val=32 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 
21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val= 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val=software 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@23 -- # accel_module=software 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val=32 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val=32 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val=1 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val=Yes 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val= 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:07.341 21:28:45 -- accel/accel.sh@21 -- # val= 00:06:07.341 21:28:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # IFS=: 00:06:07.341 21:28:45 -- accel/accel.sh@20 -- # read -r var val 00:06:08.364 21:28:47 -- accel/accel.sh@21 -- # val= 00:06:08.364 21:28:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # IFS=: 00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # read -r var val 00:06:08.364 21:28:47 -- accel/accel.sh@21 -- # val= 00:06:08.364 21:28:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # IFS=: 00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # read -r var val 00:06:08.364 21:28:47 -- accel/accel.sh@21 -- # val= 00:06:08.364 21:28:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # IFS=: 00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # read -r var val 00:06:08.364 21:28:47 -- accel/accel.sh@21 -- # val= 00:06:08.364 21:28:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # IFS=: 00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # read -r var val 00:06:08.364 21:28:47 -- accel/accel.sh@21 -- # val= 00:06:08.364 21:28:47 -- accel/accel.sh@22 -- # case "$var" in 
00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # IFS=: 00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # read -r var val 00:06:08.364 21:28:47 -- accel/accel.sh@21 -- # val= 00:06:08.364 21:28:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # IFS=: 00:06:08.364 21:28:47 -- accel/accel.sh@20 -- # read -r var val 00:06:08.364 21:28:47 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:08.364 21:28:47 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:08.364 21:28:47 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:08.364 00:06:08.364 real 0m2.658s 00:06:08.364 user 0m2.396s 00:06:08.364 sys 0m0.262s 00:06:08.364 21:28:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:08.364 21:28:47 -- common/autotest_common.sh@10 -- # set +x 00:06:08.364 ************************************ 00:06:08.364 END TEST accel_crc32c 00:06:08.364 ************************************ 00:06:08.364 21:28:47 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:08.364 21:28:47 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:08.364 21:28:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:08.364 21:28:47 -- common/autotest_common.sh@10 -- # set +x 00:06:08.364 ************************************ 00:06:08.364 START TEST accel_crc32c_C2 00:06:08.364 ************************************ 00:06:08.364 21:28:47 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:08.364 21:28:47 -- accel/accel.sh@16 -- # local accel_opc 00:06:08.364 21:28:47 -- accel/accel.sh@17 -- # local accel_module 00:06:08.364 21:28:47 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:08.364 21:28:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:08.364 21:28:47 -- accel/accel.sh@12 -- # build_accel_config 00:06:08.364 21:28:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:08.364 21:28:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:08.364 21:28:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:08.364 21:28:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:08.364 21:28:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:08.364 21:28:47 -- accel/accel.sh@41 -- # local IFS=, 00:06:08.364 21:28:47 -- accel/accel.sh@42 -- # jq -r . 00:06:08.624 [2024-07-12 21:28:47.152338] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
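The bandwidth column in these summary tables is transfers per second times the 4096-byte transfer size. Checking the single-vector crc32c result above (842976 transfers/s):

    echo $(( 842976 * 4096 / 1024 / 1024 ))   # -> 3292 MiB/s, as printed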
00:06:08.624 [2024-07-12 21:28:47.152419] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3566460 ] 00:06:08.624 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.624 [2024-07-12 21:28:47.221995] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.624 [2024-07-12 21:28:47.289695] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.002 21:28:48 -- accel/accel.sh@18 -- # out=' 00:06:10.002 SPDK Configuration: 00:06:10.002 Core mask: 0x1 00:06:10.002 00:06:10.002 Accel Perf Configuration: 00:06:10.002 Workload Type: crc32c 00:06:10.002 CRC-32C seed: 0 00:06:10.002 Transfer size: 4096 bytes 00:06:10.002 Vector count 2 00:06:10.002 Module: software 00:06:10.002 Queue depth: 32 00:06:10.002 Allocate depth: 32 00:06:10.002 # threads/core: 1 00:06:10.002 Run time: 1 seconds 00:06:10.002 Verify: Yes 00:06:10.002 00:06:10.002 Running for 1 seconds... 00:06:10.002 00:06:10.002 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:10.002 ------------------------------------------------------------------------------------ 00:06:10.002 0,0 617664/s 4825 MiB/s 0 0 00:06:10.002 ==================================================================================== 00:06:10.002 Total 617664/s 2412 MiB/s 0 0' 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:10.002 21:28:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:10.002 21:28:48 -- accel/accel.sh@12 -- # build_accel_config 00:06:10.002 21:28:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:10.002 21:28:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:10.002 21:28:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:10.002 21:28:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:10.002 21:28:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:10.002 21:28:48 -- accel/accel.sh@41 -- # local IFS=, 00:06:10.002 21:28:48 -- accel/accel.sh@42 -- # jq -r . 00:06:10.002 [2024-07-12 21:28:48.479485] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:10.002 [2024-07-12 21:28:48.479574] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3566732 ] 00:06:10.002 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.002 [2024-07-12 21:28:48.549060] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.002 [2024-07-12 21:28:48.615190] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val= 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val= 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val=0x1 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val= 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val= 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val=crc32c 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val=0 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val= 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val=software 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@23 -- # accel_module=software 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val=32 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.002 21:28:48 -- accel/accel.sh@21 -- # val=32 00:06:10.002 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.002 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.003 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.003 21:28:48 -- 
accel/accel.sh@21 -- # val=1 00:06:10.003 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.003 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.003 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.003 21:28:48 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:10.003 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.003 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.003 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.003 21:28:48 -- accel/accel.sh@21 -- # val=Yes 00:06:10.003 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.003 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.003 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.003 21:28:48 -- accel/accel.sh@21 -- # val= 00:06:10.003 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.003 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.003 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:10.003 21:28:48 -- accel/accel.sh@21 -- # val= 00:06:10.003 21:28:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.003 21:28:48 -- accel/accel.sh@20 -- # IFS=: 00:06:10.003 21:28:48 -- accel/accel.sh@20 -- # read -r var val 00:06:11.380 21:28:49 -- accel/accel.sh@21 -- # val= 00:06:11.380 21:28:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # IFS=: 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # read -r var val 00:06:11.380 21:28:49 -- accel/accel.sh@21 -- # val= 00:06:11.380 21:28:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # IFS=: 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # read -r var val 00:06:11.380 21:28:49 -- accel/accel.sh@21 -- # val= 00:06:11.380 21:28:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # IFS=: 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # read -r var val 00:06:11.380 21:28:49 -- accel/accel.sh@21 -- # val= 00:06:11.380 21:28:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # IFS=: 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # read -r var val 00:06:11.380 21:28:49 -- accel/accel.sh@21 -- # val= 00:06:11.380 21:28:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # IFS=: 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # read -r var val 00:06:11.380 21:28:49 -- accel/accel.sh@21 -- # val= 00:06:11.380 21:28:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # IFS=: 00:06:11.380 21:28:49 -- accel/accel.sh@20 -- # read -r var val 00:06:11.380 21:28:49 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:11.380 21:28:49 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:11.380 21:28:49 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:11.380 00:06:11.380 real 0m2.655s 00:06:11.380 user 0m2.401s 00:06:11.380 sys 0m0.253s 00:06:11.380 21:28:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:11.380 21:28:49 -- common/autotest_common.sh@10 -- # set +x 00:06:11.380 ************************************ 00:06:11.380 END TEST accel_crc32c_C2 00:06:11.380 ************************************ 00:06:11.380 21:28:49 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:11.380 21:28:49 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:11.380 21:28:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:11.380 21:28:49 -- common/autotest_common.sh@10 -- # set +x 00:06:11.380 ************************************ 00:06:11.380 START TEST accel_copy 
00:06:11.380 ************************************ 00:06:11.380 21:28:49 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:06:11.380 21:28:49 -- accel/accel.sh@16 -- # local accel_opc 00:06:11.380 21:28:49 -- accel/accel.sh@17 -- # local accel_module 00:06:11.380 21:28:49 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:11.380 21:28:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:11.380 21:28:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:11.380 21:28:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:11.380 21:28:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:11.380 21:28:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:11.380 21:28:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:11.380 21:28:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:11.380 21:28:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:11.380 21:28:49 -- accel/accel.sh@42 -- # jq -r . 00:06:11.380 [2024-07-12 21:28:49.852867] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:11.380 [2024-07-12 21:28:49.852956] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3566959 ] 00:06:11.380 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.380 [2024-07-12 21:28:49.924975] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.380 [2024-07-12 21:28:49.993186] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.757 21:28:51 -- accel/accel.sh@18 -- # out=' 00:06:12.757 SPDK Configuration: 00:06:12.757 Core mask: 0x1 00:06:12.757 00:06:12.757 Accel Perf Configuration: 00:06:12.757 Workload Type: copy 00:06:12.757 Transfer size: 4096 bytes 00:06:12.757 Vector count 1 00:06:12.757 Module: software 00:06:12.757 Queue depth: 32 00:06:12.757 Allocate depth: 32 00:06:12.757 # threads/core: 1 00:06:12.757 Run time: 1 seconds 00:06:12.757 Verify: Yes 00:06:12.757 00:06:12.757 Running for 1 seconds... 00:06:12.757 00:06:12.757 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:12.757 ------------------------------------------------------------------------------------ 00:06:12.757 0,0 553952/s 2163 MiB/s 0 0 00:06:12.757 ==================================================================================== 00:06:12.757 Total 553952/s 2163 MiB/s 0 0' 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:12.757 21:28:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:12.757 21:28:51 -- accel/accel.sh@12 -- # build_accel_config 00:06:12.757 21:28:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:12.757 21:28:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:12.757 21:28:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:12.757 21:28:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:12.757 21:28:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:12.757 21:28:51 -- accel/accel.sh@41 -- # local IFS=, 00:06:12.757 21:28:51 -- accel/accel.sh@42 -- # jq -r . 00:06:12.757 [2024-07-12 21:28:51.182031] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
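[Annotation] The copy test above pushes plain 4 KiB buffer copies through the software module at roughly 554K transfers/s. The bandwidth column is simply transfers/s times transfer size, which is easy to sanity-check in shell:

# Quick arithmetic check of the copy result above:
# 553952 transfers/s x 4096 bytes = ~2163 MiB/s, matching the table.
echo $(( 553952 * 4096 / 1024 / 1024 ))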
00:06:12.757 [2024-07-12 21:28:51.182124] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3567130 ] 00:06:12.757 EAL: No free 2048 kB hugepages reported on node 1 00:06:12.757 [2024-07-12 21:28:51.252091] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.757 [2024-07-12 21:28:51.316764] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val= 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val= 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val=0x1 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val= 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val= 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val=copy 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val= 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val=software 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@23 -- # accel_module=software 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val=32 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val=32 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val=1 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val=Yes 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val= 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:12.757 21:28:51 -- accel/accel.sh@21 -- # val= 00:06:12.757 21:28:51 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # IFS=: 00:06:12.757 21:28:51 -- accel/accel.sh@20 -- # read -r var val 00:06:14.135 21:28:52 -- accel/accel.sh@21 -- # val= 00:06:14.135 21:28:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # IFS=: 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # read -r var val 00:06:14.135 21:28:52 -- accel/accel.sh@21 -- # val= 00:06:14.135 21:28:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # IFS=: 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # read -r var val 00:06:14.135 21:28:52 -- accel/accel.sh@21 -- # val= 00:06:14.135 21:28:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # IFS=: 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # read -r var val 00:06:14.135 21:28:52 -- accel/accel.sh@21 -- # val= 00:06:14.135 21:28:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # IFS=: 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # read -r var val 00:06:14.135 21:28:52 -- accel/accel.sh@21 -- # val= 00:06:14.135 21:28:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # IFS=: 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # read -r var val 00:06:14.135 21:28:52 -- accel/accel.sh@21 -- # val= 00:06:14.135 21:28:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # IFS=: 00:06:14.135 21:28:52 -- accel/accel.sh@20 -- # read -r var val 00:06:14.135 21:28:52 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:14.135 21:28:52 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:14.135 21:28:52 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:14.135 00:06:14.135 real 0m2.657s 00:06:14.135 user 0m2.399s 00:06:14.135 sys 0m0.257s 00:06:14.135 21:28:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.135 21:28:52 -- common/autotest_common.sh@10 -- # set +x 00:06:14.135 ************************************ 00:06:14.135 END TEST accel_copy 00:06:14.135 ************************************ 00:06:14.135 21:28:52 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:14.135 21:28:52 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:14.135 21:28:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:14.135 21:28:52 -- common/autotest_common.sh@10 -- # set +x 00:06:14.135 ************************************ 00:06:14.135 START TEST accel_fill 00:06:14.135 ************************************ 00:06:14.135 21:28:52 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:14.135 21:28:52 -- accel/accel.sh@16 -- # local accel_opc 
00:06:14.135 21:28:52 -- accel/accel.sh@17 -- # local accel_module 00:06:14.135 21:28:52 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:14.135 21:28:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:14.135 21:28:52 -- accel/accel.sh@12 -- # build_accel_config 00:06:14.135 21:28:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:14.135 21:28:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:14.135 21:28:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:14.135 21:28:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:14.135 21:28:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:14.135 21:28:52 -- accel/accel.sh@41 -- # local IFS=, 00:06:14.135 21:28:52 -- accel/accel.sh@42 -- # jq -r . 00:06:14.135 [2024-07-12 21:28:52.553241] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:14.135 [2024-07-12 21:28:52.553334] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3567340 ] 00:06:14.135 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.135 [2024-07-12 21:28:52.623817] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.135 [2024-07-12 21:28:52.691997] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.513 21:28:53 -- accel/accel.sh@18 -- # out=' 00:06:15.513 SPDK Configuration: 00:06:15.513 Core mask: 0x1 00:06:15.513 00:06:15.513 Accel Perf Configuration: 00:06:15.513 Workload Type: fill 00:06:15.513 Fill pattern: 0x80 00:06:15.513 Transfer size: 4096 bytes 00:06:15.513 Vector count 1 00:06:15.513 Module: software 00:06:15.513 Queue depth: 64 00:06:15.513 Allocate depth: 64 00:06:15.513 # threads/core: 1 00:06:15.513 Run time: 1 seconds 00:06:15.513 Verify: Yes 00:06:15.513 00:06:15.513 Running for 1 seconds... 00:06:15.513 00:06:15.513 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:15.513 ------------------------------------------------------------------------------------ 00:06:15.513 0,0 980480/s 3830 MiB/s 0 0 00:06:15.513 ==================================================================================== 00:06:15.513 Total 980480/s 3830 MiB/s 0 0' 00:06:15.513 21:28:53 -- accel/accel.sh@20 -- # IFS=: 00:06:15.513 21:28:53 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:53 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:15.514 21:28:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:15.514 21:28:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:15.514 21:28:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:15.514 21:28:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:15.514 21:28:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:15.514 21:28:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:15.514 21:28:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:15.514 21:28:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:15.514 21:28:53 -- accel/accel.sh@42 -- # jq -r . 00:06:15.514 [2024-07-12 21:28:53.880209] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
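[Annotation] accel_fill writes a single repeated byte into each buffer, which is why it posts the highest rate in this batch (980480 transfers/s, 3830 MiB/s). In the invocation above, -f 128 sets the pattern byte (decimal 128, shown as 0x80 in the configuration), and -q 64 / -a 64 raise the queue and allocate depths from the default 32 to 64. Reproduction sketch under the same assumptions as the earlier ones:

# Fill 4096-byte buffers with 0x80 at queue depth 64 for 1 second:
SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
$SPDK/build/examples/accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y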
00:06:15.514 [2024-07-12 21:28:53.880297] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3567598 ] 00:06:15.514 EAL: No free 2048 kB hugepages reported on node 1 00:06:15.514 [2024-07-12 21:28:53.948308] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.514 [2024-07-12 21:28:54.014761] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val= 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val= 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val=0x1 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val= 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val= 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val=fill 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val=0x80 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val= 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val=software 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@23 -- # accel_module=software 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val=64 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val=64 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- 
accel/accel.sh@21 -- # val=1 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val=Yes 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val= 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:15.514 21:28:54 -- accel/accel.sh@21 -- # val= 00:06:15.514 21:28:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # IFS=: 00:06:15.514 21:28:54 -- accel/accel.sh@20 -- # read -r var val 00:06:16.451 21:28:55 -- accel/accel.sh@21 -- # val= 00:06:16.451 21:28:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.451 21:28:55 -- accel/accel.sh@20 -- # IFS=: 00:06:16.451 21:28:55 -- accel/accel.sh@20 -- # read -r var val 00:06:16.451 21:28:55 -- accel/accel.sh@21 -- # val= 00:06:16.451 21:28:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.451 21:28:55 -- accel/accel.sh@20 -- # IFS=: 00:06:16.451 21:28:55 -- accel/accel.sh@20 -- # read -r var val 00:06:16.451 21:28:55 -- accel/accel.sh@21 -- # val= 00:06:16.451 21:28:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.451 21:28:55 -- accel/accel.sh@20 -- # IFS=: 00:06:16.451 21:28:55 -- accel/accel.sh@20 -- # read -r var val 00:06:16.451 21:28:55 -- accel/accel.sh@21 -- # val= 00:06:16.451 21:28:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.451 21:28:55 -- accel/accel.sh@20 -- # IFS=: 00:06:16.451 21:28:55 -- accel/accel.sh@20 -- # read -r var val 00:06:16.451 21:28:55 -- accel/accel.sh@21 -- # val= 00:06:16.451 21:28:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.451 21:28:55 -- accel/accel.sh@20 -- # IFS=: 00:06:16.452 21:28:55 -- accel/accel.sh@20 -- # read -r var val 00:06:16.452 21:28:55 -- accel/accel.sh@21 -- # val= 00:06:16.452 21:28:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.452 21:28:55 -- accel/accel.sh@20 -- # IFS=: 00:06:16.452 21:28:55 -- accel/accel.sh@20 -- # read -r var val 00:06:16.452 21:28:55 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:16.452 21:28:55 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:16.452 21:28:55 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:16.452 00:06:16.452 real 0m2.652s 00:06:16.452 user 0m2.399s 00:06:16.452 sys 0m0.251s 00:06:16.452 21:28:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.452 21:28:55 -- common/autotest_common.sh@10 -- # set +x 00:06:16.452 ************************************ 00:06:16.452 END TEST accel_fill 00:06:16.452 ************************************ 00:06:16.452 21:28:55 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:16.452 21:28:55 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:16.452 21:28:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:16.452 21:28:55 -- common/autotest_common.sh@10 -- # set +x 00:06:16.452 ************************************ 00:06:16.452 START TEST 
accel_copy_crc32c 00:06:16.452 ************************************ 00:06:16.452 21:28:55 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:06:16.452 21:28:55 -- accel/accel.sh@16 -- # local accel_opc 00:06:16.452 21:28:55 -- accel/accel.sh@17 -- # local accel_module 00:06:16.452 21:28:55 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:16.452 21:28:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:16.452 21:28:55 -- accel/accel.sh@12 -- # build_accel_config 00:06:16.452 21:28:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:16.452 21:28:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:16.452 21:28:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:16.452 21:28:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:16.452 21:28:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:16.452 21:28:55 -- accel/accel.sh@41 -- # local IFS=, 00:06:16.452 21:28:55 -- accel/accel.sh@42 -- # jq -r . 00:06:16.711 [2024-07-12 21:28:55.248073] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:16.711 [2024-07-12 21:28:55.248149] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3567879 ] 00:06:16.711 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.711 [2024-07-12 21:28:55.316897] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.711 [2024-07-12 21:28:55.384449] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.093 21:28:56 -- accel/accel.sh@18 -- # out=' 00:06:18.093 SPDK Configuration: 00:06:18.093 Core mask: 0x1 00:06:18.093 00:06:18.093 Accel Perf Configuration: 00:06:18.093 Workload Type: copy_crc32c 00:06:18.093 CRC-32C seed: 0 00:06:18.093 Vector size: 4096 bytes 00:06:18.093 Transfer size: 4096 bytes 00:06:18.093 Vector count 1 00:06:18.093 Module: software 00:06:18.093 Queue depth: 32 00:06:18.093 Allocate depth: 32 00:06:18.093 # threads/core: 1 00:06:18.093 Run time: 1 seconds 00:06:18.093 Verify: Yes 00:06:18.093 00:06:18.093 Running for 1 seconds... 00:06:18.093 00:06:18.093 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:18.093 ------------------------------------------------------------------------------------ 00:06:18.093 0,0 427808/s 1671 MiB/s 0 0 00:06:18.093 ==================================================================================== 00:06:18.093 Total 427808/s 1671 MiB/s 0 0' 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:18.093 21:28:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:18.093 21:28:56 -- accel/accel.sh@12 -- # build_accel_config 00:06:18.093 21:28:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:18.093 21:28:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:18.093 21:28:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:18.093 21:28:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:18.093 21:28:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:18.093 21:28:56 -- accel/accel.sh@41 -- # local IFS=, 00:06:18.093 21:28:56 -- accel/accel.sh@42 -- # jq -r . 
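[Annotation] copy_crc32c fuses a buffer copy with a CRC-32C computation in one operation, so its throughput above (427808 transfers/s, 1671 MiB/s) sits below both the bare copy and bare crc32c runs earlier in the log. Sketch of the invocation, with the same caveat that the harness additionally passes -c /dev/fd/62 for the accel JSON config:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
$SPDK/build/examples/accel_perf -t 1 -w copy_crc32c -y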
00:06:18.093 [2024-07-12 21:28:56.573973] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:18.093 [2024-07-12 21:28:56.574061] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3568168 ] 00:06:18.093 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.093 [2024-07-12 21:28:56.643756] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.093 [2024-07-12 21:28:56.709899] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val= 00:06:18.093 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val= 00:06:18.093 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val=0x1 00:06:18.093 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val= 00:06:18.093 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val= 00:06:18.093 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:18.093 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.093 21:28:56 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val=0 00:06:18.093 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:18.093 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:18.093 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val= 00:06:18.093 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val=software 00:06:18.093 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.093 21:28:56 -- accel/accel.sh@23 -- # accel_module=software 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.093 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.093 21:28:56 -- accel/accel.sh@21 -- # val=32 00:06:18.094 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 
00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.094 21:28:56 -- accel/accel.sh@21 -- # val=32 00:06:18.094 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.094 21:28:56 -- accel/accel.sh@21 -- # val=1 00:06:18.094 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.094 21:28:56 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:18.094 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.094 21:28:56 -- accel/accel.sh@21 -- # val=Yes 00:06:18.094 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.094 21:28:56 -- accel/accel.sh@21 -- # val= 00:06:18.094 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:18.094 21:28:56 -- accel/accel.sh@21 -- # val= 00:06:18.094 21:28:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # IFS=: 00:06:18.094 21:28:56 -- accel/accel.sh@20 -- # read -r var val 00:06:19.473 21:28:57 -- accel/accel.sh@21 -- # val= 00:06:19.473 21:28:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # IFS=: 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # read -r var val 00:06:19.473 21:28:57 -- accel/accel.sh@21 -- # val= 00:06:19.473 21:28:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # IFS=: 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # read -r var val 00:06:19.473 21:28:57 -- accel/accel.sh@21 -- # val= 00:06:19.473 21:28:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # IFS=: 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # read -r var val 00:06:19.473 21:28:57 -- accel/accel.sh@21 -- # val= 00:06:19.473 21:28:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # IFS=: 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # read -r var val 00:06:19.473 21:28:57 -- accel/accel.sh@21 -- # val= 00:06:19.473 21:28:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # IFS=: 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # read -r var val 00:06:19.473 21:28:57 -- accel/accel.sh@21 -- # val= 00:06:19.473 21:28:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # IFS=: 00:06:19.473 21:28:57 -- accel/accel.sh@20 -- # read -r var val 00:06:19.473 21:28:57 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:19.473 21:28:57 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:19.473 21:28:57 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:19.473 00:06:19.473 real 0m2.652s 00:06:19.473 user 0m2.398s 00:06:19.473 sys 0m0.253s 00:06:19.473 21:28:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.473 21:28:57 -- common/autotest_common.sh@10 -- # set +x 00:06:19.473 ************************************ 00:06:19.473 END TEST accel_copy_crc32c 00:06:19.473 ************************************ 00:06:19.473 
21:28:57 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:19.473 21:28:57 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:19.473 21:28:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:19.473 21:28:57 -- common/autotest_common.sh@10 -- # set +x 00:06:19.473 ************************************ 00:06:19.473 START TEST accel_copy_crc32c_C2 00:06:19.473 ************************************ 00:06:19.473 21:28:57 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:19.473 21:28:57 -- accel/accel.sh@16 -- # local accel_opc 00:06:19.473 21:28:57 -- accel/accel.sh@17 -- # local accel_module 00:06:19.473 21:28:57 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:19.473 21:28:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:19.473 21:28:57 -- accel/accel.sh@12 -- # build_accel_config 00:06:19.473 21:28:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:19.473 21:28:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:19.473 21:28:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:19.473 21:28:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:19.473 21:28:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:19.473 21:28:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:19.473 21:28:57 -- accel/accel.sh@42 -- # jq -r . 00:06:19.473 [2024-07-12 21:28:57.944742] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:19.473 [2024-07-12 21:28:57.944832] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3568451 ] 00:06:19.473 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.473 [2024-07-12 21:28:58.015562] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.473 [2024-07-12 21:28:58.083329] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.852 21:28:59 -- accel/accel.sh@18 -- # out=' 00:06:20.852 SPDK Configuration: 00:06:20.852 Core mask: 0x1 00:06:20.852 00:06:20.852 Accel Perf Configuration: 00:06:20.852 Workload Type: copy_crc32c 00:06:20.852 CRC-32C seed: 0 00:06:20.852 Vector size: 4096 bytes 00:06:20.852 Transfer size: 8192 bytes 00:06:20.852 Vector count 2 00:06:20.852 Module: software 00:06:20.852 Queue depth: 32 00:06:20.852 Allocate depth: 32 00:06:20.852 # threads/core: 1 00:06:20.852 Run time: 1 seconds 00:06:20.852 Verify: Yes 00:06:20.852 00:06:20.852 Running for 1 seconds... 
00:06:20.852 00:06:20.852 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:20.852 ------------------------------------------------------------------------------------ 00:06:20.852 0,0 302080/s 2360 MiB/s 0 0 00:06:20.852 ==================================================================================== 00:06:20.852 Total 302080/s 1180 MiB/s 0 0' 00:06:20.852 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.852 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.852 21:28:59 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:20.852 21:28:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:20.852 21:28:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:20.852 21:28:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:20.852 21:28:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:20.852 21:28:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:20.852 21:28:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:20.852 21:28:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:20.852 21:28:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:20.852 21:28:59 -- accel/accel.sh@42 -- # jq -r . 00:06:20.852 [2024-07-12 21:28:59.272678] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:20.852 [2024-07-12 21:28:59.272768] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3568722 ] 00:06:20.852 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.852 [2024-07-12 21:28:59.340700] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.852 [2024-07-12 21:28:59.406712] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.852 21:28:59 -- accel/accel.sh@21 -- # val= 00:06:20.852 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.852 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.852 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.852 21:28:59 -- accel/accel.sh@21 -- # val= 00:06:20.852 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.852 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.852 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.852 21:28:59 -- accel/accel.sh@21 -- # val=0x1 00:06:20.852 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.852 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.852 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.852 21:28:59 -- accel/accel.sh@21 -- # val= 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val= 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val=0 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # 
IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val= 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val=software 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@23 -- # accel_module=software 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val=32 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val=32 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val=1 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val=Yes 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val= 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:20.853 21:28:59 -- accel/accel.sh@21 -- # val= 00:06:20.853 21:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # IFS=: 00:06:20.853 21:28:59 -- accel/accel.sh@20 -- # read -r var val 00:06:22.231 21:29:00 -- accel/accel.sh@21 -- # val= 00:06:22.231 21:29:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # IFS=: 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # read -r var val 00:06:22.232 21:29:00 -- accel/accel.sh@21 -- # val= 00:06:22.232 21:29:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # IFS=: 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # read -r var val 00:06:22.232 21:29:00 -- accel/accel.sh@21 -- # val= 00:06:22.232 21:29:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # IFS=: 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # read -r var val 00:06:22.232 21:29:00 -- accel/accel.sh@21 -- # val= 00:06:22.232 21:29:00 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # IFS=: 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # read -r var val 00:06:22.232 21:29:00 -- accel/accel.sh@21 -- # val= 00:06:22.232 21:29:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # IFS=: 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # read -r var val 00:06:22.232 21:29:00 -- accel/accel.sh@21 -- # val= 00:06:22.232 21:29:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # IFS=: 00:06:22.232 21:29:00 -- accel/accel.sh@20 -- # read -r var val 00:06:22.232 21:29:00 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:22.232 21:29:00 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:22.232 21:29:00 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:22.232 00:06:22.232 real 0m2.655s 00:06:22.232 user 0m2.393s 00:06:22.232 sys 0m0.261s 00:06:22.232 21:29:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:22.232 21:29:00 -- common/autotest_common.sh@10 -- # set +x 00:06:22.232 ************************************ 00:06:22.232 END TEST accel_copy_crc32c_C2 00:06:22.232 ************************************ 00:06:22.232 21:29:00 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:22.232 21:29:00 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:22.232 21:29:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:22.232 21:29:00 -- common/autotest_common.sh@10 -- # set +x 00:06:22.232 ************************************ 00:06:22.232 START TEST accel_dualcast 00:06:22.232 ************************************ 00:06:22.232 21:29:00 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:06:22.232 21:29:00 -- accel/accel.sh@16 -- # local accel_opc 00:06:22.232 21:29:00 -- accel/accel.sh@17 -- # local accel_module 00:06:22.232 21:29:00 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:22.232 21:29:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:22.232 21:29:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:22.232 21:29:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:22.232 21:29:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.232 21:29:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.232 21:29:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:22.232 21:29:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:22.232 21:29:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:22.232 21:29:00 -- accel/accel.sh@42 -- # jq -r . 00:06:22.232 [2024-07-12 21:29:00.644384] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
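[Annotation] The copy_crc32c run with -C 2 just above doubles the per-operation payload: two 4096-byte source vectors per 8192-byte transfer. Its result rows show the same 2x quirk as the earlier crc32c -C 2 run: 302080 transfers/s is reported as 2360 MiB/s per core, which matches the full 8192-byte transfer, but 1180 MiB/s in the Total row, which matches only the 4096-byte vector size. Reproduction sketch:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
$SPDK/build/examples/accel_perf -t 1 -w copy_crc32c -y -C 2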
00:06:22.232 [2024-07-12 21:29:00.644526] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3568936 ] 00:06:22.232 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.232 [2024-07-12 21:29:00.716512] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.232 [2024-07-12 21:29:00.786596] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.609 21:29:01 -- accel/accel.sh@18 -- # out=' 00:06:23.609 SPDK Configuration: 00:06:23.609 Core mask: 0x1 00:06:23.609 00:06:23.609 Accel Perf Configuration: 00:06:23.609 Workload Type: dualcast 00:06:23.609 Transfer size: 4096 bytes 00:06:23.609 Vector count 1 00:06:23.609 Module: software 00:06:23.609 Queue depth: 32 00:06:23.609 Allocate depth: 32 00:06:23.609 # threads/core: 1 00:06:23.609 Run time: 1 seconds 00:06:23.609 Verify: Yes 00:06:23.609 00:06:23.609 Running for 1 seconds... 00:06:23.609 00:06:23.609 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:23.609 ------------------------------------------------------------------------------------ 00:06:23.609 0,0 668128/s 2609 MiB/s 0 0 00:06:23.609 ==================================================================================== 00:06:23.609 Total 668128/s 2609 MiB/s 0 0' 00:06:23.609 21:29:01 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:01 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:01 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:23.609 21:29:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:23.609 21:29:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:23.609 21:29:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:23.609 21:29:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:23.609 21:29:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:23.609 21:29:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:23.609 21:29:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:23.609 21:29:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:23.609 21:29:01 -- accel/accel.sh@42 -- # jq -r . 00:06:23.609 [2024-07-12 21:29:01.978108] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
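[Annotation] dualcast copies one 4096-byte source into two destination buffers per operation. The bandwidth column counts a single buffer per operation rather than both writes: 668128 x 4096 B is about 2609 MiB/s, matching the table above. Sketch:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
$SPDK/build/examples/accel_perf -t 1 -w dualcast -y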
00:06:23.609 [2024-07-12 21:29:01.978193] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3569099 ] 00:06:23.609 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.609 [2024-07-12 21:29:02.048262] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.609 [2024-07-12 21:29:02.115097] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val= 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val= 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val=0x1 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val= 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val= 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val=dualcast 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val= 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val=software 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@23 -- # accel_module=software 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val=32 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val=32 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.609 21:29:02 -- accel/accel.sh@21 -- # val=1 00:06:23.609 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.609 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.610 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.610 21:29:02 
-- accel/accel.sh@21 -- # val='1 seconds' 00:06:23.610 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.610 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.610 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.610 21:29:02 -- accel/accel.sh@21 -- # val=Yes 00:06:23.610 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.610 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.610 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.610 21:29:02 -- accel/accel.sh@21 -- # val= 00:06:23.610 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.610 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.610 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:23.610 21:29:02 -- accel/accel.sh@21 -- # val= 00:06:23.610 21:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.610 21:29:02 -- accel/accel.sh@20 -- # IFS=: 00:06:23.610 21:29:02 -- accel/accel.sh@20 -- # read -r var val 00:06:24.545 21:29:03 -- accel/accel.sh@21 -- # val= 00:06:24.545 21:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.545 21:29:03 -- accel/accel.sh@20 -- # IFS=: 00:06:24.545 21:29:03 -- accel/accel.sh@20 -- # read -r var val 00:06:24.545 21:29:03 -- accel/accel.sh@21 -- # val= 00:06:24.545 21:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.545 21:29:03 -- accel/accel.sh@20 -- # IFS=: 00:06:24.546 21:29:03 -- accel/accel.sh@20 -- # read -r var val 00:06:24.546 21:29:03 -- accel/accel.sh@21 -- # val= 00:06:24.546 21:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.546 21:29:03 -- accel/accel.sh@20 -- # IFS=: 00:06:24.546 21:29:03 -- accel/accel.sh@20 -- # read -r var val 00:06:24.546 21:29:03 -- accel/accel.sh@21 -- # val= 00:06:24.546 21:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.546 21:29:03 -- accel/accel.sh@20 -- # IFS=: 00:06:24.546 21:29:03 -- accel/accel.sh@20 -- # read -r var val 00:06:24.546 21:29:03 -- accel/accel.sh@21 -- # val= 00:06:24.546 21:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.546 21:29:03 -- accel/accel.sh@20 -- # IFS=: 00:06:24.546 21:29:03 -- accel/accel.sh@20 -- # read -r var val 00:06:24.546 21:29:03 -- accel/accel.sh@21 -- # val= 00:06:24.546 21:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.546 21:29:03 -- accel/accel.sh@20 -- # IFS=: 00:06:24.546 21:29:03 -- accel/accel.sh@20 -- # read -r var val 00:06:24.546 21:29:03 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:24.546 21:29:03 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:24.546 21:29:03 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:24.546 00:06:24.546 real 0m2.662s 00:06:24.546 user 0m2.408s 00:06:24.546 sys 0m0.252s 00:06:24.546 21:29:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:24.546 21:29:03 -- common/autotest_common.sh@10 -- # set +x 00:06:24.546 ************************************ 00:06:24.546 END TEST accel_dualcast 00:06:24.546 ************************************ 00:06:24.546 21:29:03 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:24.546 21:29:03 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:24.546 21:29:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:24.546 21:29:03 -- common/autotest_common.sh@10 -- # set +x 00:06:24.546 ************************************ 00:06:24.546 START TEST accel_compare 00:06:24.546 ************************************ 00:06:24.546 21:29:03 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:06:24.546 21:29:03 -- accel/accel.sh@16 -- # local accel_opc 00:06:24.546 21:29:03 
-- accel/accel.sh@17 -- # local accel_module 00:06:24.804 21:29:03 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:24.804 21:29:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:24.804 21:29:03 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.804 21:29:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.804 21:29:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.804 21:29:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.804 21:29:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.804 21:29:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.804 21:29:03 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.804 21:29:03 -- accel/accel.sh@42 -- # jq -r . 00:06:24.804 [2024-07-12 21:29:03.347883] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:24.804 [2024-07-12 21:29:03.347973] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3569313 ] 00:06:24.804 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.804 [2024-07-12 21:29:03.418674] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.804 [2024-07-12 21:29:03.486565] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.181 21:29:04 -- accel/accel.sh@18 -- # out=' 00:06:26.181 SPDK Configuration: 00:06:26.181 Core mask: 0x1 00:06:26.181 00:06:26.181 Accel Perf Configuration: 00:06:26.181 Workload Type: compare 00:06:26.181 Transfer size: 4096 bytes 00:06:26.181 Vector count 1 00:06:26.181 Module: software 00:06:26.181 Queue depth: 32 00:06:26.181 Allocate depth: 32 00:06:26.181 # threads/core: 1 00:06:26.181 Run time: 1 seconds 00:06:26.181 Verify: Yes 00:06:26.181 00:06:26.181 Running for 1 seconds... 00:06:26.181 00:06:26.181 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:26.181 ------------------------------------------------------------------------------------ 00:06:26.181 0,0 843360/s 3294 MiB/s 0 0 00:06:26.181 ==================================================================================== 00:06:26.181 Total 843360/s 3294 MiB/s 0 0' 00:06:26.181 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.181 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.181 21:29:04 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:26.181 21:29:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:26.181 21:29:04 -- accel/accel.sh@12 -- # build_accel_config 00:06:26.181 21:29:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:26.181 21:29:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:26.181 21:29:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:26.181 21:29:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:26.181 21:29:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:26.181 21:29:04 -- accel/accel.sh@41 -- # local IFS=, 00:06:26.181 21:29:04 -- accel/accel.sh@42 -- # jq -r . 00:06:26.181 [2024-07-12 21:29:04.674506] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
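The compare table above is internally consistent: at the 4096-byte transfer size shown, 843360 transfers/s works out to the reported 3294 MiB/s. A minimal sketch of reproducing the run by hand, assuming the workspace layout used in this log (the harness additionally feeds a JSON accel config over /dev/fd/62, omitted here):

    cd /var/jenkins/workspace/short-fuzz-phy-autotest
    ./spdk/build/examples/accel_perf -t 1 -w compare -y   # -t run time (s), -w workload, -y verify
    echo $(( 843360 * 4096 / 1048576 ))                   # 3294 MiB/s, matching the table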
00:06:26.181 [2024-07-12 21:29:04.674601] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3569582 ] 00:06:26.181 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.181 [2024-07-12 21:29:04.742616] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.181 [2024-07-12 21:29:04.808999] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.181 21:29:04 -- accel/accel.sh@21 -- # val= 00:06:26.181 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.181 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.181 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val= 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val=0x1 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val= 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val= 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val=compare 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val= 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val=software 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@23 -- # accel_module=software 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val=32 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val=32 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val=1 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val=Yes 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val= 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:26.182 21:29:04 -- accel/accel.sh@21 -- # val= 00:06:26.182 21:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # IFS=: 00:06:26.182 21:29:04 -- accel/accel.sh@20 -- # read -r var val 00:06:27.558 21:29:05 -- accel/accel.sh@21 -- # val= 00:06:27.558 21:29:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # IFS=: 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # read -r var val 00:06:27.558 21:29:05 -- accel/accel.sh@21 -- # val= 00:06:27.558 21:29:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # IFS=: 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # read -r var val 00:06:27.558 21:29:05 -- accel/accel.sh@21 -- # val= 00:06:27.558 21:29:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # IFS=: 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # read -r var val 00:06:27.558 21:29:05 -- accel/accel.sh@21 -- # val= 00:06:27.558 21:29:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # IFS=: 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # read -r var val 00:06:27.558 21:29:05 -- accel/accel.sh@21 -- # val= 00:06:27.558 21:29:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # IFS=: 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # read -r var val 00:06:27.558 21:29:05 -- accel/accel.sh@21 -- # val= 00:06:27.558 21:29:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # IFS=: 00:06:27.558 21:29:05 -- accel/accel.sh@20 -- # read -r var val 00:06:27.558 21:29:05 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:27.558 21:29:05 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:27.558 21:29:05 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:27.558 00:06:27.558 real 0m2.651s 00:06:27.558 user 0m2.403s 00:06:27.558 sys 0m0.247s 00:06:27.558 21:29:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.558 21:29:05 -- common/autotest_common.sh@10 -- # set +x 00:06:27.558 ************************************ 00:06:27.558 END TEST accel_compare 00:06:27.558 ************************************ 00:06:27.558 21:29:06 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:27.558 21:29:06 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:27.558 21:29:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:27.558 21:29:06 -- common/autotest_common.sh@10 -- # set +x 00:06:27.558 ************************************ 00:06:27.558 START TEST accel_xor 00:06:27.558 ************************************ 00:06:27.558 21:29:06 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:06:27.558 21:29:06 -- accel/accel.sh@16 -- # local accel_opc 00:06:27.558 21:29:06 -- accel/accel.sh@17 
-- # local accel_module 00:06:27.558 21:29:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:27.558 21:29:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:27.558 21:29:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:27.558 21:29:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:27.558 21:29:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:27.558 21:29:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:27.558 21:29:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:27.558 21:29:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:27.558 21:29:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:27.558 21:29:06 -- accel/accel.sh@42 -- # jq -r . 00:06:27.558 [2024-07-12 21:29:06.044622] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:27.558 [2024-07-12 21:29:06.044712] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3569870 ] 00:06:27.558 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.558 [2024-07-12 21:29:06.115893] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.558 [2024-07-12 21:29:06.184111] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.932 21:29:07 -- accel/accel.sh@18 -- # out=' 00:06:28.932 SPDK Configuration: 00:06:28.932 Core mask: 0x1 00:06:28.932 00:06:28.932 Accel Perf Configuration: 00:06:28.932 Workload Type: xor 00:06:28.932 Source buffers: 2 00:06:28.932 Transfer size: 4096 bytes 00:06:28.932 Vector count 1 00:06:28.932 Module: software 00:06:28.932 Queue depth: 32 00:06:28.932 Allocate depth: 32 00:06:28.932 # threads/core: 1 00:06:28.932 Run time: 1 seconds 00:06:28.932 Verify: Yes 00:06:28.932 00:06:28.932 Running for 1 seconds... 00:06:28.932 00:06:28.932 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:28.932 ------------------------------------------------------------------------------------ 00:06:28.932 0,0 723648/s 2826 MiB/s 0 0 00:06:28.932 ==================================================================================== 00:06:28.932 Total 723648/s 2826 MiB/s 0 0' 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:28.932 21:29:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:28.932 21:29:07 -- accel/accel.sh@12 -- # build_accel_config 00:06:28.932 21:29:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:28.932 21:29:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:28.932 21:29:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:28.932 21:29:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:28.932 21:29:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:28.932 21:29:07 -- accel/accel.sh@41 -- # local IFS=, 00:06:28.932 21:29:07 -- accel/accel.sh@42 -- # jq -r . 00:06:28.932 [2024-07-12 21:29:07.373198] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
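The xor pass uses two source buffers by default (Source buffers: 2 in the config above), and its 723648 transfers/s again matches the reported 2826 MiB/s at 4096 bytes per transfer. A sketch under the same assumptions as the compare run:

    ./spdk/build/examples/accel_perf -t 1 -w xor -y       # two source buffers by default
    echo $(( 723648 * 4096 / 1048576 ))                   # 2826 MiB/s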
00:06:28.932 [2024-07-12 21:29:07.373297] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3570137 ] 00:06:28.932 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.932 [2024-07-12 21:29:07.442961] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.932 [2024-07-12 21:29:07.508706] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val= 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val= 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val=0x1 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val= 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val= 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val=xor 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val=2 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val= 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val=software 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@23 -- # accel_module=software 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val=32 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val=32 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- 
accel/accel.sh@21 -- # val=1 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val=Yes 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val= 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:28.932 21:29:07 -- accel/accel.sh@21 -- # val= 00:06:28.932 21:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # IFS=: 00:06:28.932 21:29:07 -- accel/accel.sh@20 -- # read -r var val 00:06:30.307 21:29:08 -- accel/accel.sh@21 -- # val= 00:06:30.307 21:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # IFS=: 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # read -r var val 00:06:30.307 21:29:08 -- accel/accel.sh@21 -- # val= 00:06:30.307 21:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # IFS=: 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # read -r var val 00:06:30.307 21:29:08 -- accel/accel.sh@21 -- # val= 00:06:30.307 21:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # IFS=: 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # read -r var val 00:06:30.307 21:29:08 -- accel/accel.sh@21 -- # val= 00:06:30.307 21:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # IFS=: 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # read -r var val 00:06:30.307 21:29:08 -- accel/accel.sh@21 -- # val= 00:06:30.307 21:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # IFS=: 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # read -r var val 00:06:30.307 21:29:08 -- accel/accel.sh@21 -- # val= 00:06:30.307 21:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # IFS=: 00:06:30.307 21:29:08 -- accel/accel.sh@20 -- # read -r var val 00:06:30.307 21:29:08 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:30.307 21:29:08 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:30.307 21:29:08 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:30.307 00:06:30.307 real 0m2.657s 00:06:30.307 user 0m2.390s 00:06:30.307 sys 0m0.264s 00:06:30.307 21:29:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.307 21:29:08 -- common/autotest_common.sh@10 -- # set +x 00:06:30.307 ************************************ 00:06:30.307 END TEST accel_xor 00:06:30.307 ************************************ 00:06:30.307 21:29:08 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:30.307 21:29:08 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:30.307 21:29:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:30.307 21:29:08 -- common/autotest_common.sh@10 -- # set +x 00:06:30.307 ************************************ 00:06:30.307 START TEST accel_xor 
00:06:30.307 ************************************ 00:06:30.307 21:29:08 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:06:30.307 21:29:08 -- accel/accel.sh@16 -- # local accel_opc 00:06:30.307 21:29:08 -- accel/accel.sh@17 -- # local accel_module 00:06:30.307 21:29:08 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:30.307 21:29:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:30.307 21:29:08 -- accel/accel.sh@12 -- # build_accel_config 00:06:30.307 21:29:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:30.307 21:29:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:30.307 21:29:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:30.307 21:29:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:30.307 21:29:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:30.307 21:29:08 -- accel/accel.sh@41 -- # local IFS=, 00:06:30.307 21:29:08 -- accel/accel.sh@42 -- # jq -r . 00:06:30.307 [2024-07-12 21:29:08.744173] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:30.307 [2024-07-12 21:29:08.744260] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3570420 ] 00:06:30.307 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.307 [2024-07-12 21:29:08.814261] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.307 [2024-07-12 21:29:08.881199] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.686 21:29:10 -- accel/accel.sh@18 -- # out=' 00:06:31.686 SPDK Configuration: 00:06:31.686 Core mask: 0x1 00:06:31.686 00:06:31.686 Accel Perf Configuration: 00:06:31.686 Workload Type: xor 00:06:31.686 Source buffers: 3 00:06:31.686 Transfer size: 4096 bytes 00:06:31.686 Vector count 1 00:06:31.686 Module: software 00:06:31.686 Queue depth: 32 00:06:31.686 Allocate depth: 32 00:06:31.686 # threads/core: 1 00:06:31.686 Run time: 1 seconds 00:06:31.686 Verify: Yes 00:06:31.686 00:06:31.686 Running for 1 seconds... 00:06:31.686 00:06:31.686 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:31.686 ------------------------------------------------------------------------------------ 00:06:31.686 0,0 672480/s 2626 MiB/s 0 0 00:06:31.686 ==================================================================================== 00:06:31.686 Total 672480/s 2626 MiB/s 0 0' 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:31.686 21:29:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:31.686 21:29:10 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.686 21:29:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.686 21:29:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.686 21:29:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.686 21:29:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.686 21:29:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.686 21:29:10 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.686 21:29:10 -- accel/accel.sh@42 -- # jq -r . 00:06:31.686 [2024-07-12 21:29:10.072038] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
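Adding a third source buffer via -x 3 costs roughly 7% of throughput (723648/s down to 672480/s), which is expected since every output block now XORs one more input. The same sketch with the extra flag:

    ./spdk/build/examples/accel_perf -t 1 -w xor -y -x 3  # -x sets the source buffer count
    echo $(( 672480 * 4096 / 1048576 ))                   # 2626 MiB/s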
00:06:31.686 [2024-07-12 21:29:10.072127] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3570693 ] 00:06:31.686 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.686 [2024-07-12 21:29:10.140740] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.686 [2024-07-12 21:29:10.208981] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val= 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val= 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val=0x1 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val= 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val= 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val=xor 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val=3 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val= 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val=software 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@23 -- # accel_module=software 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val=32 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val=32 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- 
accel/accel.sh@21 -- # val=1 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val=Yes 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val= 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:31.686 21:29:10 -- accel/accel.sh@21 -- # val= 00:06:31.686 21:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # IFS=: 00:06:31.686 21:29:10 -- accel/accel.sh@20 -- # read -r var val 00:06:32.623 21:29:11 -- accel/accel.sh@21 -- # val= 00:06:32.623 21:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.623 21:29:11 -- accel/accel.sh@20 -- # IFS=: 00:06:32.623 21:29:11 -- accel/accel.sh@20 -- # read -r var val 00:06:32.623 21:29:11 -- accel/accel.sh@21 -- # val= 00:06:32.623 21:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.623 21:29:11 -- accel/accel.sh@20 -- # IFS=: 00:06:32.623 21:29:11 -- accel/accel.sh@20 -- # read -r var val 00:06:32.623 21:29:11 -- accel/accel.sh@21 -- # val= 00:06:32.623 21:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.623 21:29:11 -- accel/accel.sh@20 -- # IFS=: 00:06:32.623 21:29:11 -- accel/accel.sh@20 -- # read -r var val 00:06:32.623 21:29:11 -- accel/accel.sh@21 -- # val= 00:06:32.623 21:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.623 21:29:11 -- accel/accel.sh@20 -- # IFS=: 00:06:32.624 21:29:11 -- accel/accel.sh@20 -- # read -r var val 00:06:32.624 21:29:11 -- accel/accel.sh@21 -- # val= 00:06:32.624 21:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.624 21:29:11 -- accel/accel.sh@20 -- # IFS=: 00:06:32.624 21:29:11 -- accel/accel.sh@20 -- # read -r var val 00:06:32.624 21:29:11 -- accel/accel.sh@21 -- # val= 00:06:32.624 21:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.624 21:29:11 -- accel/accel.sh@20 -- # IFS=: 00:06:32.624 21:29:11 -- accel/accel.sh@20 -- # read -r var val 00:06:32.624 21:29:11 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:32.624 21:29:11 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:32.624 21:29:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:32.624 00:06:32.624 real 0m2.656s 00:06:32.624 user 0m2.392s 00:06:32.624 sys 0m0.262s 00:06:32.624 21:29:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.624 21:29:11 -- common/autotest_common.sh@10 -- # set +x 00:06:32.624 ************************************ 00:06:32.624 END TEST accel_xor 00:06:32.624 ************************************ 00:06:32.883 21:29:11 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:32.883 21:29:11 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:32.883 21:29:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:32.883 21:29:11 -- common/autotest_common.sh@10 -- # set +x 00:06:32.883 ************************************ 00:06:32.883 START TEST 
accel_dif_verify 00:06:32.883 ************************************ 00:06:32.883 21:29:11 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:06:32.883 21:29:11 -- accel/accel.sh@16 -- # local accel_opc 00:06:32.883 21:29:11 -- accel/accel.sh@17 -- # local accel_module 00:06:32.883 21:29:11 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:32.883 21:29:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:32.883 21:29:11 -- accel/accel.sh@12 -- # build_accel_config 00:06:32.883 21:29:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.883 21:29:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.883 21:29:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.883 21:29:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.883 21:29:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.883 21:29:11 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.883 21:29:11 -- accel/accel.sh@42 -- # jq -r . 00:06:32.883 [2024-07-12 21:29:11.444098] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:32.883 [2024-07-12 21:29:11.444180] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3570921 ] 00:06:32.883 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.883 [2024-07-12 21:29:11.513798] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.883 [2024-07-12 21:29:11.581763] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.261 21:29:12 -- accel/accel.sh@18 -- # out=' 00:06:34.261 SPDK Configuration: 00:06:34.261 Core mask: 0x1 00:06:34.261 00:06:34.261 Accel Perf Configuration: 00:06:34.261 Workload Type: dif_verify 00:06:34.261 Vector size: 4096 bytes 00:06:34.261 Transfer size: 4096 bytes 00:06:34.261 Block size: 512 bytes 00:06:34.261 Metadata size: 8 bytes 00:06:34.261 Vector count 1 00:06:34.261 Module: software 00:06:34.261 Queue depth: 32 00:06:34.261 Allocate depth: 32 00:06:34.261 # threads/core: 1 00:06:34.261 Run time: 1 seconds 00:06:34.261 Verify: No 00:06:34.261 00:06:34.261 Running for 1 seconds... 00:06:34.261 00:06:34.261 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:34.261 ------------------------------------------------------------------------------------ 00:06:34.261 0,0 240320/s 953 MiB/s 0 0 00:06:34.261 ==================================================================================== 00:06:34.261 Total 240320/s 938 MiB/s 0 0' 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:34.261 21:29:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:34.261 21:29:12 -- accel/accel.sh@12 -- # build_accel_config 00:06:34.261 21:29:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:34.261 21:29:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:34.261 21:29:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:34.261 21:29:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:34.261 21:29:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:34.261 21:29:12 -- accel/accel.sh@41 -- # local IFS=, 00:06:34.261 21:29:12 -- accel/accel.sh@42 -- # jq -r . 
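The dif_verify config splits each 4096-byte transfer into eight 512-byte blocks with an 8-byte DIF tuple apiece, so 64 bytes of protection info ride along with every transfer. That also appears to explain the two bandwidth figures in the table: the per-core row matches data plus protection bytes, while the Total row matches data bytes only (an interpretation, not something the log states). Both readings check out arithmetically:

    echo $(( 240320 * (4096 + 8 * 8) / 1048576 ))   # 953 MiB/s, the per-core figure
    echo $(( 240320 * 4096 / 1048576 ))             # 938 MiB/s, the Total figure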
00:06:34.261 [2024-07-12 21:29:12.769950] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:34.261 [2024-07-12 21:29:12.770040] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3571085 ] 00:06:34.261 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.261 [2024-07-12 21:29:12.842178] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.261 [2024-07-12 21:29:12.910268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val= 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val= 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val=0x1 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val= 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val= 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val=dif_verify 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val= 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val=software 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@23 -- # 
accel_module=software 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val=32 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val=32 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val=1 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val=No 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val= 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:34.261 21:29:12 -- accel/accel.sh@21 -- # val= 00:06:34.261 21:29:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # IFS=: 00:06:34.261 21:29:12 -- accel/accel.sh@20 -- # read -r var val 00:06:35.637 21:29:14 -- accel/accel.sh@21 -- # val= 00:06:35.637 21:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.637 21:29:14 -- accel/accel.sh@20 -- # IFS=: 00:06:35.637 21:29:14 -- accel/accel.sh@20 -- # read -r var val 00:06:35.637 21:29:14 -- accel/accel.sh@21 -- # val= 00:06:35.637 21:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.637 21:29:14 -- accel/accel.sh@20 -- # IFS=: 00:06:35.637 21:29:14 -- accel/accel.sh@20 -- # read -r var val 00:06:35.637 21:29:14 -- accel/accel.sh@21 -- # val= 00:06:35.637 21:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.637 21:29:14 -- accel/accel.sh@20 -- # IFS=: 00:06:35.638 21:29:14 -- accel/accel.sh@20 -- # read -r var val 00:06:35.638 21:29:14 -- accel/accel.sh@21 -- # val= 00:06:35.638 21:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.638 21:29:14 -- accel/accel.sh@20 -- # IFS=: 00:06:35.638 21:29:14 -- accel/accel.sh@20 -- # read -r var val 00:06:35.638 21:29:14 -- accel/accel.sh@21 -- # val= 00:06:35.638 21:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.638 21:29:14 -- accel/accel.sh@20 -- # IFS=: 00:06:35.638 21:29:14 -- accel/accel.sh@20 -- # read -r var val 00:06:35.638 21:29:14 -- accel/accel.sh@21 -- # val= 00:06:35.638 21:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.638 21:29:14 -- accel/accel.sh@20 -- # IFS=: 00:06:35.638 21:29:14 -- accel/accel.sh@20 -- # read -r var val 00:06:35.638 21:29:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:35.638 21:29:14 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:06:35.638 21:29:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:35.638 00:06:35.638 real 0m2.656s 00:06:35.638 user 0m2.397s 00:06:35.638 sys 0m0.259s 00:06:35.638 21:29:14 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.638 21:29:14 -- common/autotest_common.sh@10 -- # set +x 00:06:35.638 ************************************ 00:06:35.638 END TEST accel_dif_verify 00:06:35.638 ************************************ 00:06:35.638 21:29:14 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:35.638 21:29:14 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:35.638 21:29:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:35.638 21:29:14 -- common/autotest_common.sh@10 -- # set +x 00:06:35.638 ************************************ 00:06:35.638 START TEST accel_dif_generate 00:06:35.638 ************************************ 00:06:35.638 21:29:14 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:06:35.638 21:29:14 -- accel/accel.sh@16 -- # local accel_opc 00:06:35.638 21:29:14 -- accel/accel.sh@17 -- # local accel_module 00:06:35.638 21:29:14 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:06:35.638 21:29:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:35.638 21:29:14 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.638 21:29:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.638 21:29:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.638 21:29:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.638 21:29:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.638 21:29:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.638 21:29:14 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.638 21:29:14 -- accel/accel.sh@42 -- # jq -r . 00:06:35.638 [2024-07-12 21:29:14.144279] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:35.638 [2024-07-12 21:29:14.144366] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3571303 ] 00:06:35.638 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.638 [2024-07-12 21:29:14.215384] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.638 [2024-07-12 21:29:14.283124] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.015 21:29:15 -- accel/accel.sh@18 -- # out=' 00:06:37.015 SPDK Configuration: 00:06:37.015 Core mask: 0x1 00:06:37.015 00:06:37.015 Accel Perf Configuration: 00:06:37.015 Workload Type: dif_generate 00:06:37.015 Vector size: 4096 bytes 00:06:37.015 Transfer size: 4096 bytes 00:06:37.015 Block size: 512 bytes 00:06:37.015 Metadata size: 8 bytes 00:06:37.015 Vector count 1 00:06:37.015 Module: software 00:06:37.015 Queue depth: 32 00:06:37.015 Allocate depth: 32 00:06:37.015 # threads/core: 1 00:06:37.015 Run time: 1 seconds 00:06:37.015 Verify: No 00:06:37.015 00:06:37.015 Running for 1 seconds... 
00:06:37.015 00:06:37.015 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:37.015 ------------------------------------------------------------------------------------ 00:06:37.015 0,0 293248/s 1163 MiB/s 0 0 00:06:37.015 ==================================================================================== 00:06:37.015 Total 293248/s 1145 MiB/s 0 0' 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:37.015 21:29:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:37.015 21:29:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.015 21:29:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.015 21:29:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.015 21:29:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.015 21:29:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.015 21:29:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.015 21:29:15 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.015 21:29:15 -- accel/accel.sh@42 -- # jq -r . 00:06:37.015 [2024-07-12 21:29:15.471532] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:37.015 [2024-07-12 21:29:15.471624] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3571559 ] 00:06:37.015 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.015 [2024-07-12 21:29:15.541899] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.015 [2024-07-12 21:29:15.607856] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val= 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val= 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val=0x1 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val= 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val= 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val=dif_generate 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 
00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val= 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val=software 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@23 -- # accel_module=software 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val=32 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val=32 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val=1 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val=No 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val= 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:37.015 21:29:15 -- accel/accel.sh@21 -- # val= 00:06:37.015 21:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # IFS=: 00:06:37.015 21:29:15 -- accel/accel.sh@20 -- # read -r var val 00:06:38.394 21:29:16 -- accel/accel.sh@21 -- # val= 00:06:38.394 21:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # IFS=: 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # read -r var val 00:06:38.394 21:29:16 -- accel/accel.sh@21 -- # val= 00:06:38.394 21:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # IFS=: 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # read -r var val 00:06:38.394 21:29:16 -- accel/accel.sh@21 -- # val= 00:06:38.394 21:29:16 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # IFS=: 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # read -r var val 00:06:38.394 21:29:16 -- accel/accel.sh@21 -- # val= 00:06:38.394 21:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # IFS=: 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # read -r var val 00:06:38.394 21:29:16 -- accel/accel.sh@21 -- # val= 00:06:38.394 21:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # IFS=: 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # read -r var val 00:06:38.394 21:29:16 -- accel/accel.sh@21 -- # val= 00:06:38.394 21:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # IFS=: 00:06:38.394 21:29:16 -- accel/accel.sh@20 -- # read -r var val 00:06:38.394 21:29:16 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:38.394 21:29:16 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:06:38.394 21:29:16 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:38.394 00:06:38.394 real 0m2.655s 00:06:38.394 user 0m2.392s 00:06:38.394 sys 0m0.262s 00:06:38.394 21:29:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:38.394 21:29:16 -- common/autotest_common.sh@10 -- # set +x 00:06:38.394 ************************************ 00:06:38.394 END TEST accel_dif_generate 00:06:38.394 ************************************ 00:06:38.394 21:29:16 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:38.394 21:29:16 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:38.394 21:29:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:38.394 21:29:16 -- common/autotest_common.sh@10 -- # set +x 00:06:38.394 ************************************ 00:06:38.394 START TEST accel_dif_generate_copy 00:06:38.394 ************************************ 00:06:38.394 21:29:16 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:06:38.394 21:29:16 -- accel/accel.sh@16 -- # local accel_opc 00:06:38.394 21:29:16 -- accel/accel.sh@17 -- # local accel_module 00:06:38.394 21:29:16 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:06:38.394 21:29:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:38.394 21:29:16 -- accel/accel.sh@12 -- # build_accel_config 00:06:38.394 21:29:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:38.394 21:29:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.394 21:29:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.394 21:29:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:38.394 21:29:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:38.394 21:29:16 -- accel/accel.sh@41 -- # local IFS=, 00:06:38.394 21:29:16 -- accel/accel.sh@42 -- # jq -r . 00:06:38.394 [2024-07-12 21:29:16.839917] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
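The dif_generate pass that just finished runs noticeably faster than dif_verify (293248/s vs 240320/s), plausibly because it only computes and writes the 8-byte tuples rather than checking them against existing data. Its numbers follow the same pattern as the verify run, under the same assumptions as the sketches above:

    ./spdk/build/examples/accel_perf -t 1 -w dif_generate
    echo $(( 293248 * 4096 / 1048576 ))             # 1145 MiB/s, the Total figure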
00:06:38.394 [2024-07-12 21:29:16.840005] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3571840 ] 00:06:38.394 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.394 [2024-07-12 21:29:16.909252] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.394 [2024-07-12 21:29:16.976566] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.772 21:29:18 -- accel/accel.sh@18 -- # out=' 00:06:39.772 SPDK Configuration: 00:06:39.772 Core mask: 0x1 00:06:39.772 00:06:39.772 Accel Perf Configuration: 00:06:39.772 Workload Type: dif_generate_copy 00:06:39.772 Vector size: 4096 bytes 00:06:39.772 Transfer size: 4096 bytes 00:06:39.772 Vector count 1 00:06:39.772 Module: software 00:06:39.772 Queue depth: 32 00:06:39.772 Allocate depth: 32 00:06:39.772 # threads/core: 1 00:06:39.772 Run time: 1 seconds 00:06:39.772 Verify: No 00:06:39.772 00:06:39.772 Running for 1 seconds... 00:06:39.772 00:06:39.772 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:39.772 ------------------------------------------------------------------------------------ 00:06:39.772 0,0 224928/s 892 MiB/s 0 0 00:06:39.772 ==================================================================================== 00:06:39.772 Total 224928/s 878 MiB/s 0 0' 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:39.772 21:29:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:39.772 21:29:18 -- accel/accel.sh@12 -- # build_accel_config 00:06:39.772 21:29:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.772 21:29:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.772 21:29:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.772 21:29:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.772 21:29:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.772 21:29:18 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.772 21:29:18 -- accel/accel.sh@42 -- # jq -r . 00:06:39.772 [2024-07-12 21:29:18.167386] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
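dif_generate_copy is the slowest of the three DIF workloads here (224928/s, 878 MiB/s in the Total row), consistent with it adding a full buffer copy on top of tuple generation. The same sanity check applies:

    ./spdk/build/examples/accel_perf -t 1 -w dif_generate_copy
    echo $(( 224928 * 4096 / 1048576 ))             # 878 MiB/s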
00:06:39.772 [2024-07-12 21:29:18.167479] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3572114 ] 00:06:39.772 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.772 [2024-07-12 21:29:18.236621] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.772 [2024-07-12 21:29:18.302407] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val= 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val= 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val=0x1 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val= 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val= 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val= 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val=software 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@23 -- # accel_module=software 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val=32 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val=32 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r 
var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val=1 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val=No 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val= 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:39.772 21:29:18 -- accel/accel.sh@21 -- # val= 00:06:39.772 21:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.772 21:29:18 -- accel/accel.sh@20 -- # IFS=: 00:06:39.773 21:29:18 -- accel/accel.sh@20 -- # read -r var val 00:06:40.712 21:29:19 -- accel/accel.sh@21 -- # val= 00:06:40.712 21:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # IFS=: 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # read -r var val 00:06:40.712 21:29:19 -- accel/accel.sh@21 -- # val= 00:06:40.712 21:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # IFS=: 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # read -r var val 00:06:40.712 21:29:19 -- accel/accel.sh@21 -- # val= 00:06:40.712 21:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # IFS=: 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # read -r var val 00:06:40.712 21:29:19 -- accel/accel.sh@21 -- # val= 00:06:40.712 21:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # IFS=: 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # read -r var val 00:06:40.712 21:29:19 -- accel/accel.sh@21 -- # val= 00:06:40.712 21:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # IFS=: 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # read -r var val 00:06:40.712 21:29:19 -- accel/accel.sh@21 -- # val= 00:06:40.712 21:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # IFS=: 00:06:40.712 21:29:19 -- accel/accel.sh@20 -- # read -r var val 00:06:40.712 21:29:19 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:40.712 21:29:19 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:06:40.712 21:29:19 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:40.712 00:06:40.712 real 0m2.652s 00:06:40.712 user 0m2.388s 00:06:40.712 sys 0m0.262s 00:06:40.712 21:29:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:40.712 21:29:19 -- common/autotest_common.sh@10 -- # set +x 00:06:40.712 ************************************ 00:06:40.712 END TEST accel_dif_generate_copy 00:06:40.712 ************************************ 00:06:41.096 21:29:19 -- accel/accel.sh@107 -- # [[ y == y ]] 00:06:41.096 21:29:19 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:41.096 21:29:19 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:41.096 21:29:19 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:06:41.096 21:29:19 -- common/autotest_common.sh@10 -- # set +x 00:06:41.096 ************************************ 00:06:41.096 START TEST accel_comp 00:06:41.096 ************************************ 00:06:41.096 21:29:19 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:41.096 21:29:19 -- accel/accel.sh@16 -- # local accel_opc 00:06:41.096 21:29:19 -- accel/accel.sh@17 -- # local accel_module 00:06:41.096 21:29:19 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:41.096 21:29:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:41.096 21:29:19 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.096 21:29:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.096 21:29:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.096 21:29:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.096 21:29:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.096 21:29:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.096 21:29:19 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.096 21:29:19 -- accel/accel.sh@42 -- # jq -r . 00:06:41.096 [2024-07-12 21:29:19.540227] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:41.096 [2024-07-12 21:29:19.540324] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3572395 ] 00:06:41.096 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.096 [2024-07-12 21:29:19.609376] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.096 [2024-07-12 21:29:19.675538] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.474 21:29:20 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:42.474 00:06:42.474 SPDK Configuration: 00:06:42.474 Core mask: 0x1 00:06:42.474 00:06:42.474 Accel Perf Configuration: 00:06:42.474 Workload Type: compress 00:06:42.474 Transfer size: 4096 bytes 00:06:42.474 Vector count 1 00:06:42.474 Module: software 00:06:42.474 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:42.474 Queue depth: 32 00:06:42.474 Allocate depth: 32 00:06:42.474 # threads/core: 1 00:06:42.474 Run time: 1 seconds 00:06:42.474 Verify: No 00:06:42.474 00:06:42.474 Running for 1 seconds... 
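Two details in the configuration block above are worth noting: "Module: software" means no hardware acceleration engine was configured on this runner, so compress runs on the software fallback, and "Verify: No" means the pass only measures throughput without checking results. The [[ ... ]] lines that close each TEST (the @28 trace records) assert exactly this pairing; roughly, as a reconstruction rather than the verbatim accel.sh source:

    [[ -n $accel_module ]]           # a module name was parsed out of the dump
    [[ -n $accel_opc ]]              # so was an opcode
    [[ $accel_module == software ]]  # and the expected engine really ran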
00:06:42.474 00:06:42.474 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:42.474 ------------------------------------------------------------------------------------ 00:06:42.474 0,0 65824/s 274 MiB/s 0 0 00:06:42.474 ==================================================================================== 00:06:42.474 Total 65824/s 257 MiB/s 0 0' 00:06:42.474 21:29:20 -- accel/accel.sh@20 -- # IFS=: 00:06:42.474 21:29:20 -- accel/accel.sh@20 -- # read -r var val 00:06:42.474 21:29:20 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:42.474 21:29:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:42.474 21:29:20 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.474 21:29:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.474 21:29:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.474 21:29:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.474 21:29:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.474 21:29:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.474 21:29:20 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.474 21:29:20 -- accel/accel.sh@42 -- # jq -r . 00:06:42.474 [2024-07-12 21:29:20.867914] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:42.474 [2024-07-12 21:29:20.867993] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3572669 ] 00:06:42.474 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.474 [2024-07-12 21:29:20.936790] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.474 [2024-07-12 21:29:21.002836] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.474 21:29:21 -- accel/accel.sh@21 -- # val= 00:06:42.474 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.474 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val= 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val= 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val=0x1 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val= 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val= 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val=compress 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 
21:29:21 -- accel/accel.sh@24 -- # accel_opc=compress 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val= 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val=software 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@23 -- # accel_module=software 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val=32 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val=32 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val=1 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val=No 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val= 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:42.475 21:29:21 -- accel/accel.sh@21 -- # val= 00:06:42.475 21:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # IFS=: 00:06:42.475 21:29:21 -- accel/accel.sh@20 -- # read -r var val 00:06:43.412 21:29:22 -- accel/accel.sh@21 -- # val= 00:06:43.412 21:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # IFS=: 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # read -r var val 00:06:43.412 21:29:22 -- accel/accel.sh@21 -- # val= 00:06:43.412 21:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # IFS=: 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # read -r var val 00:06:43.412 21:29:22 -- accel/accel.sh@21 -- # val= 00:06:43.412 21:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # 
IFS=: 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # read -r var val 00:06:43.412 21:29:22 -- accel/accel.sh@21 -- # val= 00:06:43.412 21:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # IFS=: 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # read -r var val 00:06:43.412 21:29:22 -- accel/accel.sh@21 -- # val= 00:06:43.412 21:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # IFS=: 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # read -r var val 00:06:43.412 21:29:22 -- accel/accel.sh@21 -- # val= 00:06:43.412 21:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # IFS=: 00:06:43.412 21:29:22 -- accel/accel.sh@20 -- # read -r var val 00:06:43.412 21:29:22 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:43.412 21:29:22 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:06:43.412 21:29:22 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:43.412 00:06:43.412 real 0m2.657s 00:06:43.412 user 0m2.396s 00:06:43.412 sys 0m0.260s 00:06:43.412 21:29:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:43.412 21:29:22 -- common/autotest_common.sh@10 -- # set +x 00:06:43.412 ************************************ 00:06:43.412 END TEST accel_comp 00:06:43.412 ************************************ 00:06:43.672 21:29:22 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:43.672 21:29:22 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:43.672 21:29:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:43.672 21:29:22 -- common/autotest_common.sh@10 -- # set +x 00:06:43.672 ************************************ 00:06:43.672 START TEST accel_decomp 00:06:43.672 ************************************ 00:06:43.672 21:29:22 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:43.672 21:29:22 -- accel/accel.sh@16 -- # local accel_opc 00:06:43.672 21:29:22 -- accel/accel.sh@17 -- # local accel_module 00:06:43.672 21:29:22 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:43.672 21:29:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:43.672 21:29:22 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.672 21:29:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.672 21:29:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.672 21:29:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.672 21:29:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.672 21:29:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.672 21:29:22 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.672 21:29:22 -- accel/accel.sh@42 -- # jq -r . 00:06:43.672 [2024-07-12 21:29:22.241881] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
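Note the extra -y on this run_test invocation compared with the compress pass: -y turns on verification, which is why the configuration that follows reports "Verify: Yes" where compress reported "Verify: No". Side by side (paths abbreviated here for readability):

    accel_perf -c /dev/fd/62 -t 1 -w compress   -l .../test/accel/bib      # Verify: No
    accel_perf -c /dev/fd/62 -t 1 -w decompress -l .../test/accel/bib -y   # Verify: Yes

In both cases -l names the input file the workload runs through ("File Name" in the dump).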
00:06:43.672 [2024-07-12 21:29:22.241980] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3572925 ] 00:06:43.672 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.672 [2024-07-12 21:29:22.310787] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.672 [2024-07-12 21:29:22.378707] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.050 21:29:23 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:45.050 00:06:45.050 SPDK Configuration: 00:06:45.050 Core mask: 0x1 00:06:45.050 00:06:45.050 Accel Perf Configuration: 00:06:45.050 Workload Type: decompress 00:06:45.050 Transfer size: 4096 bytes 00:06:45.050 Vector count 1 00:06:45.050 Module: software 00:06:45.050 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:45.050 Queue depth: 32 00:06:45.050 Allocate depth: 32 00:06:45.050 # threads/core: 1 00:06:45.050 Run time: 1 seconds 00:06:45.050 Verify: Yes 00:06:45.050 00:06:45.050 Running for 1 seconds... 00:06:45.050 00:06:45.050 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:45.050 ------------------------------------------------------------------------------------ 00:06:45.050 0,0 95616/s 176 MiB/s 0 0 00:06:45.050 ==================================================================================== 00:06:45.050 Total 95616/s 373 MiB/s 0 0' 00:06:45.050 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.050 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.050 21:29:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:45.050 21:29:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:45.050 21:29:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.050 21:29:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.050 21:29:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.050 21:29:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.050 21:29:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.050 21:29:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.050 21:29:23 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.050 21:29:23 -- accel/accel.sh@42 -- # jq -r . 00:06:45.050 [2024-07-12 21:29:23.570415] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
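A quick consistency check on the decompress table above: bandwidth should equal transfers per second times the 4096-byte transfer size, and the Total row passes:

    $ echo $(( 95616 * 4096 / 1024 / 1024 ))
    373

which matches "Total 95616/s 373 MiB/s". The per-core cell reading "176 MiB/s" beside the same 95616/s does not, so the Total row is the self-consistent figure (the compress table earlier shows the same quirk: 65824/s works out to its Total of 257 MiB/s, not the 274 MiB/s printed per core).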
00:06:45.051 [2024-07-12 21:29:23.570513] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3573089 ] 00:06:45.051 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.051 [2024-07-12 21:29:23.641494] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.051 [2024-07-12 21:29:23.707572] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val= 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val= 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val= 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val=0x1 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val= 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val= 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val=decompress 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val= 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val=software 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@23 -- # accel_module=software 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val=32 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 
21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val=32 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val=1 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val=Yes 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val= 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:45.051 21:29:23 -- accel/accel.sh@21 -- # val= 00:06:45.051 21:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:45.051 21:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:46.430 21:29:24 -- accel/accel.sh@21 -- # val= 00:06:46.430 21:29:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # IFS=: 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # read -r var val 00:06:46.430 21:29:24 -- accel/accel.sh@21 -- # val= 00:06:46.430 21:29:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # IFS=: 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # read -r var val 00:06:46.430 21:29:24 -- accel/accel.sh@21 -- # val= 00:06:46.430 21:29:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # IFS=: 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # read -r var val 00:06:46.430 21:29:24 -- accel/accel.sh@21 -- # val= 00:06:46.430 21:29:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # IFS=: 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # read -r var val 00:06:46.430 21:29:24 -- accel/accel.sh@21 -- # val= 00:06:46.430 21:29:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # IFS=: 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # read -r var val 00:06:46.430 21:29:24 -- accel/accel.sh@21 -- # val= 00:06:46.430 21:29:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # IFS=: 00:06:46.430 21:29:24 -- accel/accel.sh@20 -- # read -r var val 00:06:46.430 21:29:24 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:46.430 21:29:24 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:46.430 21:29:24 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.430 00:06:46.430 real 0m2.661s 00:06:46.430 user 0m2.405s 00:06:46.430 sys 0m0.256s 00:06:46.430 21:29:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.430 21:29:24 -- common/autotest_common.sh@10 -- # set +x 00:06:46.430 ************************************ 00:06:46.430 END TEST accel_decomp 00:06:46.430 ************************************ 00:06:46.431 21:29:24 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:46.431 21:29:24 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:46.431 21:29:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:46.431 21:29:24 -- common/autotest_common.sh@10 -- # set +x 00:06:46.431 ************************************ 00:06:46.431 START TEST accel_decmop_full 00:06:46.431 ************************************ 00:06:46.431 21:29:24 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:46.431 21:29:24 -- accel/accel.sh@16 -- # local accel_opc 00:06:46.431 21:29:24 -- accel/accel.sh@17 -- # local accel_module 00:06:46.431 21:29:24 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:46.431 21:29:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:46.431 21:29:24 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.431 21:29:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.431 21:29:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.431 21:29:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.431 21:29:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.431 21:29:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.431 21:29:24 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.431 21:29:24 -- accel/accel.sh@42 -- # jq -r . 00:06:46.431 [2024-07-12 21:29:24.945684] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:46.431 [2024-07-12 21:29:24.945779] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3573281 ] 00:06:46.431 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.431 [2024-07-12 21:29:25.016604] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.431 [2024-07-12 21:29:25.084452] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.810 21:29:26 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:47.810 00:06:47.810 SPDK Configuration: 00:06:47.810 Core mask: 0x1 00:06:47.810 00:06:47.810 Accel Perf Configuration: 00:06:47.810 Workload Type: decompress 00:06:47.810 Transfer size: 111250 bytes 00:06:47.810 Vector count 1 00:06:47.810 Module: software 00:06:47.810 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:47.810 Queue depth: 32 00:06:47.810 Allocate depth: 32 00:06:47.810 # threads/core: 1 00:06:47.810 Run time: 1 seconds 00:06:47.810 Verify: Yes 00:06:47.810 00:06:47.810 Running for 1 seconds... 
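Two things stand out in this test: the name accel_decmop_full is spelled that way by the test script itself (run_test is invoked with that literal string above), and the new -o 0 flag changes the transfer size, so the configuration reports 111250-byte transfers (apparently sized from the bib input) instead of the 4 KiB default. The throughput arithmetic for the table that follows works the same way, just with the larger buffer:

    $ echo $(( 5888 * 111250 / 1024 / 1024 ))
    624

which is the 624 MiB/s Total row below.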
00:06:47.810 00:06:47.810 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:47.810 ------------------------------------------------------------------------------------ 00:06:47.810 0,0 5888/s 243 MiB/s 0 0 00:06:47.810 ==================================================================================== 00:06:47.810 Total 5888/s 624 MiB/s 0 0' 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:47.810 21:29:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:47.810 21:29:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.810 21:29:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.810 21:29:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.810 21:29:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.810 21:29:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.810 21:29:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.810 21:29:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.810 21:29:26 -- accel/accel.sh@42 -- # jq -r . 00:06:47.810 [2024-07-12 21:29:26.283496] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:47.810 [2024-07-12 21:29:26.283589] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3573535 ] 00:06:47.810 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.810 [2024-07-12 21:29:26.350614] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.810 [2024-07-12 21:29:26.418844] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val= 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val= 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val= 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val=0x1 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val= 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val= 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val=decompress 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val= 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val=software 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@23 -- # accel_module=software 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val=32 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val=32 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val=1 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val=Yes 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val= 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:47.810 21:29:26 -- accel/accel.sh@21 -- # val= 00:06:47.810 21:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:47.810 21:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:49.195 21:29:27 -- accel/accel.sh@21 -- # val= 00:06:49.195 21:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.195 21:29:27 -- accel/accel.sh@20 -- # IFS=: 00:06:49.195 21:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:49.195 21:29:27 -- accel/accel.sh@21 -- # val= 00:06:49.195 21:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.195 21:29:27 -- accel/accel.sh@20 -- # IFS=: 00:06:49.195 21:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:49.195 21:29:27 -- accel/accel.sh@21 -- # val= 00:06:49.195 21:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.195 21:29:27 
-- accel/accel.sh@20 -- # IFS=: 00:06:49.195 21:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:49.195 21:29:27 -- accel/accel.sh@21 -- # val= 00:06:49.195 21:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.195 21:29:27 -- accel/accel.sh@20 -- # IFS=: 00:06:49.195 21:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:49.195 21:29:27 -- accel/accel.sh@21 -- # val= 00:06:49.195 21:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.195 21:29:27 -- accel/accel.sh@20 -- # IFS=: 00:06:49.195 21:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:49.195 21:29:27 -- accel/accel.sh@21 -- # val= 00:06:49.195 21:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.195 21:29:27 -- accel/accel.sh@20 -- # IFS=: 00:06:49.195 21:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:49.195 21:29:27 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:49.195 21:29:27 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:49.195 21:29:27 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.195 00:06:49.195 real 0m2.674s 00:06:49.195 user 0m2.408s 00:06:49.195 sys 0m0.263s 00:06:49.195 21:29:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:49.195 21:29:27 -- common/autotest_common.sh@10 -- # set +x 00:06:49.195 ************************************ 00:06:49.195 END TEST accel_decmop_full 00:06:49.195 ************************************ 00:06:49.195 21:29:27 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:49.195 21:29:27 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:49.195 21:29:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:49.195 21:29:27 -- common/autotest_common.sh@10 -- # set +x 00:06:49.195 ************************************ 00:06:49.195 START TEST accel_decomp_mcore 00:06:49.195 ************************************ 00:06:49.195 21:29:27 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:49.195 21:29:27 -- accel/accel.sh@16 -- # local accel_opc 00:06:49.195 21:29:27 -- accel/accel.sh@17 -- # local accel_module 00:06:49.195 21:29:27 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:49.195 21:29:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:49.195 21:29:27 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.195 21:29:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.195 21:29:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.195 21:29:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.195 21:29:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.195 21:29:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.195 21:29:27 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.195 21:29:27 -- accel/accel.sh@42 -- # jq -r . 00:06:49.195 [2024-07-12 21:29:27.667132] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
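This is the first multi-core pass: the -m 0xf core mask (binary 1111) hands the app cores 0 through 3, which lines up with "Total cores available: 4" and the four "Reactor started on core N" notices above, and the command differs from the single-core decompress run only in that mask:

    accel_perf -c /dev/fd/62 -t 1 -w decompress -l .../test/accel/bib -y -m 0xf

The results table accordingly carries one row per core.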
00:06:49.195 [2024-07-12 21:29:27.667226] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3573818 ] 00:06:49.195 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.195 [2024-07-12 21:29:27.739133] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:49.195 [2024-07-12 21:29:27.811337] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:49.195 [2024-07-12 21:29:27.811430] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:49.195 [2024-07-12 21:29:27.811509] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:49.195 [2024-07-12 21:29:27.811512] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.574 21:29:28 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:50.574 00:06:50.574 SPDK Configuration: 00:06:50.574 Core mask: 0xf 00:06:50.574 00:06:50.574 Accel Perf Configuration: 00:06:50.574 Workload Type: decompress 00:06:50.574 Transfer size: 4096 bytes 00:06:50.574 Vector count 1 00:06:50.574 Module: software 00:06:50.574 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:50.574 Queue depth: 32 00:06:50.574 Allocate depth: 32 00:06:50.574 # threads/core: 1 00:06:50.574 Run time: 1 seconds 00:06:50.574 Verify: Yes 00:06:50.574 00:06:50.574 Running for 1 seconds... 00:06:50.574 00:06:50.574 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:50.574 ------------------------------------------------------------------------------------ 00:06:50.574 0,0 77760/s 143 MiB/s 0 0 00:06:50.574 3,0 78528/s 144 MiB/s 0 0 00:06:50.574 2,0 78688/s 145 MiB/s 0 0 00:06:50.574 1,0 78784/s 145 MiB/s 0 0 00:06:50.574 ==================================================================================== 00:06:50.574 Total 313760/s 1225 MiB/s 0 0' 00:06:50.574 21:29:28 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:28 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:28 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:50.574 21:29:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:50.574 21:29:28 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.574 21:29:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:50.574 21:29:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.574 21:29:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.574 21:29:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:50.574 21:29:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:50.574 21:29:28 -- accel/accel.sh@41 -- # local IFS=, 00:06:50.574 21:29:28 -- accel/accel.sh@42 -- # jq -r . 00:06:50.574 [2024-07-12 21:29:29.011510] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
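The aggregate for the four-core run checks out the same way:

    $ echo $(( 313760 * 4096 / 1024 / 1024 ))
    1225

matching "Total 313760/s 1225 MiB/s" above. Scaling is not quite linear, though: each core sustains roughly 78k transfers/s here against the ~95.6k/s the single-core decompress pass managed, so four cores buy about 3.3x.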
00:06:50.574 [2024-07-12 21:29:29.011600] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3574091 ] 00:06:50.574 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.574 [2024-07-12 21:29:29.081082] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:50.574 [2024-07-12 21:29:29.150285] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:50.574 [2024-07-12 21:29:29.150383] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:50.574 [2024-07-12 21:29:29.150465] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.574 [2024-07-12 21:29:29.150467] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val= 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val= 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val= 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val=0xf 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val= 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val= 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val=decompress 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val= 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val=software 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@23 -- # accel_module=software 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val=32 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val=32 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val=1 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val=Yes 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val= 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:50.574 21:29:29 -- accel/accel.sh@21 -- # val= 00:06:50.574 21:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:50.574 21:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:51.954 21:29:30 -- accel/accel.sh@21 -- # val= 00:06:51.954 21:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:51.954 21:29:30 -- accel/accel.sh@21 -- # val= 00:06:51.954 21:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:51.954 21:29:30 -- accel/accel.sh@21 -- # val= 00:06:51.954 21:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:51.954 21:29:30 -- accel/accel.sh@21 -- # val= 00:06:51.954 21:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:51.954 21:29:30 -- accel/accel.sh@21 -- # val= 00:06:51.954 21:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:51.954 21:29:30 -- accel/accel.sh@21 -- # val= 00:06:51.954 21:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:51.954 21:29:30 -- accel/accel.sh@21 -- # val= 00:06:51.954 21:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:51.954 21:29:30 -- accel/accel.sh@21 -- # val= 00:06:51.954 21:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.954 
21:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:51.954 21:29:30 -- accel/accel.sh@21 -- # val= 00:06:51.954 21:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:51.954 21:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:51.954 21:29:30 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:51.954 21:29:30 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:51.954 21:29:30 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.954 00:06:51.954 real 0m2.693s 00:06:51.954 user 0m9.075s 00:06:51.954 sys 0m0.279s 00:06:51.954 21:29:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.954 21:29:30 -- common/autotest_common.sh@10 -- # set +x 00:06:51.954 ************************************ 00:06:51.954 END TEST accel_decomp_mcore 00:06:51.954 ************************************ 00:06:51.954 21:29:30 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:51.954 21:29:30 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:51.954 21:29:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:51.954 21:29:30 -- common/autotest_common.sh@10 -- # set +x 00:06:51.954 ************************************ 00:06:51.954 START TEST accel_decomp_full_mcore 00:06:51.954 ************************************ 00:06:51.954 21:29:30 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:51.954 21:29:30 -- accel/accel.sh@16 -- # local accel_opc 00:06:51.954 21:29:30 -- accel/accel.sh@17 -- # local accel_module 00:06:51.954 21:29:30 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:51.954 21:29:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:51.954 21:29:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:51.954 21:29:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:51.954 21:29:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.954 21:29:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.954 21:29:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:51.954 21:29:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:51.954 21:29:30 -- accel/accel.sh@41 -- # local IFS=, 00:06:51.954 21:29:30 -- accel/accel.sh@42 -- # jq -r . 00:06:51.954 [2024-07-12 21:29:30.406756] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
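The autotest_common.sh trace lines that bracket every test (the '[' 13 -le 1 ']' check, xtrace_disable, and the starred START/END banners) come from the run_test wrapper, which names a test, runs it under time (hence the real/user/sys lines), and fences its xtrace output. A rough sketch of the shape visible in this log, a reconstruction and not the verbatim SPDK source:

    run_test() {
        if [ $# -le 1 ]; then   # matches the "'[' 13 -le 1 ']'" records above
            echo "usage: run_test <name> <command...>" >&2
            return 1
        fi
        local name=$1; shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"               # e.g. accel_test -t 1 -w decompress -l ... -y -o 0 -m 0xf
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
    }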
00:06:51.954 [2024-07-12 21:29:30.406848] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3574387 ] 00:06:51.954 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.954 [2024-07-12 21:29:30.475215] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:51.954 [2024-07-12 21:29:30.547220] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:51.954 [2024-07-12 21:29:30.547316] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:51.954 [2024-07-12 21:29:30.547391] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:51.954 [2024-07-12 21:29:30.547393] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.334 21:29:31 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:53.334 00:06:53.334 SPDK Configuration: 00:06:53.334 Core mask: 0xf 00:06:53.334 00:06:53.334 Accel Perf Configuration: 00:06:53.334 Workload Type: decompress 00:06:53.334 Transfer size: 111250 bytes 00:06:53.334 Vector count 1 00:06:53.334 Module: software 00:06:53.334 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:53.334 Queue depth: 32 00:06:53.334 Allocate depth: 32 00:06:53.334 # threads/core: 1 00:06:53.334 Run time: 1 seconds 00:06:53.334 Verify: Yes 00:06:53.334 00:06:53.334 Running for 1 seconds... 00:06:53.334 00:06:53.334 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:53.334 ------------------------------------------------------------------------------------ 00:06:53.334 0,0 5760/s 237 MiB/s 0 0 00:06:53.334 3,0 5792/s 239 MiB/s 0 0 00:06:53.334 2,0 5792/s 239 MiB/s 0 0 00:06:53.334 1,0 5792/s 239 MiB/s 0 0 00:06:53.334 ==================================================================================== 00:06:53.334 Total 23136/s 2454 MiB/s 0 0' 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:53.334 21:29:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:53.334 21:29:31 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.334 21:29:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.334 21:29:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.334 21:29:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.334 21:29:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.334 21:29:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.334 21:29:31 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.334 21:29:31 -- accel/accel.sh@42 -- # jq -r . 00:06:53.334 [2024-07-12 21:29:31.755068] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
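And the final cross-check, four cores pushing 111250-byte transfers:

    $ echo $(( 23136 * 111250 / 1024 / 1024 ))
    2454

matching the "Total 23136/s 2454 MiB/s" row above. Per core that is about 5.8k transfers/s, essentially the 5888/s the single-core full-buffer pass achieved, so the large-buffer decompress workload scales nearly linearly where the 4 KiB one did not.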
00:06:53.334 [2024-07-12 21:29:31.755174] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3574656 ] 00:06:53.334 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.334 [2024-07-12 21:29:31.823401] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:53.334 [2024-07-12 21:29:31.892042] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:53.334 [2024-07-12 21:29:31.892139] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:53.334 [2024-07-12 21:29:31.892225] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:53.334 [2024-07-12 21:29:31.892227] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val= 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val= 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val= 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val=0xf 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val= 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val= 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val=decompress 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val= 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val=software 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@23 -- # accel_module=software 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val=32 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val=32 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val=1 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val=Yes 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val= 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:53.334 21:29:31 -- accel/accel.sh@21 -- # val= 00:06:53.334 21:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:53.334 21:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:54.723 21:29:33 -- accel/accel.sh@21 -- # val= 00:06:54.723 21:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:54.723 21:29:33 -- accel/accel.sh@21 -- # val= 00:06:54.723 21:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:54.723 21:29:33 -- accel/accel.sh@21 -- # val= 00:06:54.723 21:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:54.723 21:29:33 -- accel/accel.sh@21 -- # val= 00:06:54.723 21:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:54.723 21:29:33 -- accel/accel.sh@21 -- # val= 00:06:54.723 21:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:54.723 21:29:33 -- accel/accel.sh@21 -- # val= 00:06:54.723 21:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:54.723 21:29:33 -- accel/accel.sh@21 -- # val= 00:06:54.723 21:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:54.723 21:29:33 -- accel/accel.sh@21 -- # val= 00:06:54.723 21:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.723 
21:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:54.723 21:29:33 -- accel/accel.sh@21 -- # val= 00:06:54.723 21:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:54.723 21:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:54.723 21:29:33 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:54.723 21:29:33 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:54.723 21:29:33 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:54.723 00:06:54.723 real 0m2.703s 00:06:54.723 user 0m9.140s 00:06:54.723 sys 0m0.275s 00:06:54.723 21:29:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:54.723 21:29:33 -- common/autotest_common.sh@10 -- # set +x 00:06:54.723 ************************************ 00:06:54.723 END TEST accel_decomp_full_mcore 00:06:54.723 ************************************ 00:06:54.723 21:29:33 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:54.723 21:29:33 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:06:54.723 21:29:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:54.723 21:29:33 -- common/autotest_common.sh@10 -- # set +x 00:06:54.723 ************************************ 00:06:54.723 START TEST accel_decomp_mthread 00:06:54.723 ************************************ 00:06:54.723 21:29:33 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:54.723 21:29:33 -- accel/accel.sh@16 -- # local accel_opc 00:06:54.723 21:29:33 -- accel/accel.sh@17 -- # local accel_module 00:06:54.723 21:29:33 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:54.723 21:29:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:54.723 21:29:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:54.723 21:29:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:54.723 21:29:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.723 21:29:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.723 21:29:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:54.723 21:29:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:54.723 21:29:33 -- accel/accel.sh@41 -- # local IFS=, 00:06:54.723 21:29:33 -- accel/accel.sh@42 -- # jq -r . 00:06:54.723 [2024-07-12 21:29:33.158080] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:54.723 [2024-07-12 21:29:33.158175] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3574943 ] 00:06:54.723 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.723 [2024-07-12 21:29:33.226908] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.723 [2024-07-12 21:29:33.295001] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.102 21:29:34 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:06:56.102 00:06:56.102 SPDK Configuration: 00:06:56.102 Core mask: 0x1 00:06:56.102 00:06:56.102 Accel Perf Configuration: 00:06:56.102 Workload Type: decompress 00:06:56.102 Transfer size: 4096 bytes 00:06:56.102 Vector count 1 00:06:56.102 Module: software 00:06:56.102 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:56.102 Queue depth: 32 00:06:56.102 Allocate depth: 32 00:06:56.102 # threads/core: 2 00:06:56.102 Run time: 1 seconds 00:06:56.102 Verify: Yes 00:06:56.102 00:06:56.102 Running for 1 seconds... 00:06:56.102 00:06:56.102 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:56.102 ------------------------------------------------------------------------------------ 00:06:56.102 0,1 46880/s 86 MiB/s 0 0 00:06:56.102 0,0 46784/s 86 MiB/s 0 0 00:06:56.102 ==================================================================================== 00:06:56.102 Total 93664/s 365 MiB/s 0 0' 00:06:56.102 21:29:34 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.102 21:29:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.102 21:29:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.102 21:29:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.102 21:29:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.102 21:29:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.102 21:29:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.102 21:29:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.102 21:29:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.102 21:29:34 -- accel/accel.sh@42 -- # jq -r . 00:06:56.102 [2024-07-12 21:29:34.478600] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
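The mthread variant differs only in its threading flags. A hedged equivalent of the command logged above (same inline-config assumption as the earlier accel_perf sketch):

    # -T 2 requests two worker threads per core, matching the "# threads/core: 2"
    # line and the 0,0 / 0,1 rows in the results table for the single-core 0x1 mask
    ./build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress \
        -l test/accel/bib -y -T 2 \
        62<<< '{"subsystems":[{"subsystem":"accel","config":[]}]}'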
00:06:56.102 [2024-07-12 21:29:34.478653] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3575177 ] 00:06:56.102 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.102 [2024-07-12 21:29:34.538555] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.102 [2024-07-12 21:29:34.606024] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.102 21:29:34 -- accel/accel.sh@21 -- # val= 00:06:56.102 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.102 21:29:34 -- accel/accel.sh@21 -- # val= 00:06:56.102 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.102 21:29:34 -- accel/accel.sh@21 -- # val= 00:06:56.102 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.102 21:29:34 -- accel/accel.sh@21 -- # val=0x1 00:06:56.102 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.102 21:29:34 -- accel/accel.sh@21 -- # val= 00:06:56.102 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.102 21:29:34 -- accel/accel.sh@21 -- # val= 00:06:56.102 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.102 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val=decompress 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val= 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val=software 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@23 -- # accel_module=software 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val=32 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 
21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val=32 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val=2 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val=Yes 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val= 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:56.103 21:29:34 -- accel/accel.sh@21 -- # val= 00:06:56.103 21:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:56.103 21:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:57.040 21:29:35 -- accel/accel.sh@21 -- # val= 00:06:57.040 21:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:57.040 21:29:35 -- accel/accel.sh@21 -- # val= 00:06:57.040 21:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:57.040 21:29:35 -- accel/accel.sh@21 -- # val= 00:06:57.040 21:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:57.040 21:29:35 -- accel/accel.sh@21 -- # val= 00:06:57.040 21:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:57.040 21:29:35 -- accel/accel.sh@21 -- # val= 00:06:57.040 21:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:57.040 21:29:35 -- accel/accel.sh@21 -- # val= 00:06:57.040 21:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:57.040 21:29:35 -- accel/accel.sh@21 -- # val= 00:06:57.040 21:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:57.040 21:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:57.040 21:29:35 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:57.040 21:29:35 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:57.040 21:29:35 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:57.040 00:06:57.040 real 0m2.652s 00:06:57.040 user 0m2.414s 00:06:57.040 sys 0m0.249s 00:06:57.040 21:29:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:57.041 21:29:35 -- common/autotest_common.sh@10 -- # 
set +x 00:06:57.041 ************************************ 00:06:57.041 END TEST accel_decomp_mthread 00:06:57.041 ************************************ 00:06:57.300 21:29:35 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:57.300 21:29:35 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:57.300 21:29:35 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:57.300 21:29:35 -- common/autotest_common.sh@10 -- # set +x 00:06:57.300 ************************************ 00:06:57.300 START TEST accel_deomp_full_mthread 00:06:57.300 ************************************ 00:06:57.300 21:29:35 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:57.300 21:29:35 -- accel/accel.sh@16 -- # local accel_opc 00:06:57.300 21:29:35 -- accel/accel.sh@17 -- # local accel_module 00:06:57.300 21:29:35 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:57.300 21:29:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:57.300 21:29:35 -- accel/accel.sh@12 -- # build_accel_config 00:06:57.300 21:29:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:57.300 21:29:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.300 21:29:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.300 21:29:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:57.300 21:29:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:57.300 21:29:35 -- accel/accel.sh@41 -- # local IFS=, 00:06:57.300 21:29:35 -- accel/accel.sh@42 -- # jq -r . 00:06:57.300 [2024-07-12 21:29:35.856398] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:57.300 [2024-07-12 21:29:35.856496] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3575379 ] 00:06:57.300 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.300 [2024-07-12 21:29:35.926598] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.300 [2024-07-12 21:29:35.995873] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.678 21:29:37 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:58.678 00:06:58.678 SPDK Configuration: 00:06:58.678 Core mask: 0x1 00:06:58.678 00:06:58.678 Accel Perf Configuration: 00:06:58.678 Workload Type: decompress 00:06:58.678 Transfer size: 111250 bytes 00:06:58.678 Vector count 1 00:06:58.678 Module: software 00:06:58.678 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:58.678 Queue depth: 32 00:06:58.678 Allocate depth: 32 00:06:58.678 # threads/core: 2 00:06:58.678 Run time: 1 seconds 00:06:58.678 Verify: Yes 00:06:58.678 00:06:58.678 Running for 1 seconds... 
00:06:58.678 00:06:58.678 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:58.678 ------------------------------------------------------------------------------------ 00:06:58.678 0,1 2944/s 121 MiB/s 0 0 00:06:58.678 0,0 2944/s 121 MiB/s 0 0 00:06:58.678 ==================================================================================== 00:06:58.678 Total 5888/s 624 MiB/s 0 0' 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.678 21:29:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:58.678 21:29:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:58.678 21:29:37 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.678 21:29:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.678 21:29:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.678 21:29:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.678 21:29:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.678 21:29:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.678 21:29:37 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.678 21:29:37 -- accel/accel.sh@42 -- # jq -r . 00:06:58.678 [2024-07-12 21:29:37.209649] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:58.678 [2024-07-12 21:29:37.209740] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3575543 ] 00:06:58.678 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.678 [2024-07-12 21:29:37.281043] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.678 [2024-07-12 21:29:37.348394] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.678 21:29:37 -- accel/accel.sh@21 -- # val= 00:06:58.678 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.678 21:29:37 -- accel/accel.sh@21 -- # val= 00:06:58.678 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.678 21:29:37 -- accel/accel.sh@21 -- # val= 00:06:58.678 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.678 21:29:37 -- accel/accel.sh@21 -- # val=0x1 00:06:58.678 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.678 21:29:37 -- accel/accel.sh@21 -- # val= 00:06:58.678 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.678 21:29:37 -- accel/accel.sh@21 -- # val= 00:06:58.678 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.678 21:29:37 -- accel/accel.sh@21 -- # val=decompress 
00:06:58.678 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.678 21:29:37 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.678 21:29:37 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:58.678 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.678 21:29:37 -- accel/accel.sh@21 -- # val= 00:06:58.678 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.678 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.679 21:29:37 -- accel/accel.sh@21 -- # val=software 00:06:58.679 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.679 21:29:37 -- accel/accel.sh@23 -- # accel_module=software 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.679 21:29:37 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:58.679 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.679 21:29:37 -- accel/accel.sh@21 -- # val=32 00:06:58.679 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.679 21:29:37 -- accel/accel.sh@21 -- # val=32 00:06:58.679 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.679 21:29:37 -- accel/accel.sh@21 -- # val=2 00:06:58.679 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.679 21:29:37 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:58.679 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.679 21:29:37 -- accel/accel.sh@21 -- # val=Yes 00:06:58.679 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.679 21:29:37 -- accel/accel.sh@21 -- # val= 00:06:58.679 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:58.679 21:29:37 -- accel/accel.sh@21 -- # val= 00:06:58.679 21:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:58.679 21:29:37 -- accel/accel.sh@20 -- # read -r var val 00:07:00.080 21:29:38 -- accel/accel.sh@21 -- # val= 00:07:00.080 21:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # IFS=: 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # read -r var val 00:07:00.080 21:29:38 -- accel/accel.sh@21 -- # val= 00:07:00.080 21:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # IFS=: 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # read -r var val 00:07:00.080 21:29:38 -- accel/accel.sh@21 -- # val= 00:07:00.080 21:29:38 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # IFS=: 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # read -r var val 00:07:00.080 21:29:38 -- accel/accel.sh@21 -- # val= 00:07:00.080 21:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # IFS=: 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # read -r var val 00:07:00.080 21:29:38 -- accel/accel.sh@21 -- # val= 00:07:00.080 21:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # IFS=: 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # read -r var val 00:07:00.080 21:29:38 -- accel/accel.sh@21 -- # val= 00:07:00.080 21:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # IFS=: 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # read -r var val 00:07:00.080 21:29:38 -- accel/accel.sh@21 -- # val= 00:07:00.080 21:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # IFS=: 00:07:00.080 21:29:38 -- accel/accel.sh@20 -- # read -r var val 00:07:00.080 21:29:38 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:00.080 21:29:38 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:00.080 21:29:38 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:00.080 00:07:00.080 real 0m2.709s 00:07:00.080 user 0m2.464s 00:07:00.080 sys 0m0.253s 00:07:00.080 21:29:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:00.080 21:29:38 -- common/autotest_common.sh@10 -- # set +x 00:07:00.080 ************************************ 00:07:00.080 END TEST accel_deomp_full_mthread 00:07:00.080 ************************************ 00:07:00.080 21:29:38 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:00.080 21:29:38 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:00.080 21:29:38 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:00.080 21:29:38 -- accel/accel.sh@129 -- # build_accel_config 00:07:00.080 21:29:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:00.080 21:29:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.080 21:29:38 -- common/autotest_common.sh@10 -- # set +x 00:07:00.080 21:29:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.080 21:29:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.080 21:29:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.080 21:29:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.080 21:29:38 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.080 21:29:38 -- accel/accel.sh@42 -- # jq -r . 00:07:00.080 ************************************ 00:07:00.080 START TEST accel_dif_functional_tests 00:07:00.080 ************************************ 00:07:00.080 21:29:38 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:00.080 [2024-07-12 21:29:38.616708] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:00.081 [2024-07-12 21:29:38.616799] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3575812 ] 00:07:00.081 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.081 [2024-07-12 21:29:38.686678] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:00.081 [2024-07-12 21:29:38.756107] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:00.081 [2024-07-12 21:29:38.756202] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.081 [2024-07-12 21:29:38.756203] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:00.081 00:07:00.081 00:07:00.081 CUnit - A unit testing framework for C - Version 2.1-3 00:07:00.081 http://cunit.sourceforge.net/ 00:07:00.081 00:07:00.081 00:07:00.081 Suite: accel_dif 00:07:00.081 Test: verify: DIF generated, GUARD check ...passed 00:07:00.081 Test: verify: DIF generated, APPTAG check ...passed 00:07:00.081 Test: verify: DIF generated, REFTAG check ...passed 00:07:00.081 Test: verify: DIF not generated, GUARD check ...[2024-07-12 21:29:38.822990] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:00.081 [2024-07-12 21:29:38.823040] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:00.081 passed 00:07:00.081 Test: verify: DIF not generated, APPTAG check ...[2024-07-12 21:29:38.823091] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:00.081 [2024-07-12 21:29:38.823111] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:00.081 passed 00:07:00.081 Test: verify: DIF not generated, REFTAG check ...[2024-07-12 21:29:38.823133] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:00.081 [2024-07-12 21:29:38.823152] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:00.081 passed 00:07:00.081 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:00.081 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-12 21:29:38.823199] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:00.081 passed 00:07:00.081 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:00.081 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:00.081 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:00.081 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-12 21:29:38.823303] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:00.081 passed 00:07:00.081 Test: generate copy: DIF generated, GUARD check ...passed 00:07:00.081 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:00.081 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:00.081 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:00.081 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:00.081 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:00.081 Test: generate copy: iovecs-len validate ...[2024-07-12 21:29:38.823481] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
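The iovecs-len case above is a negative test: spdk_dif_generate_copy is handed bounce iovecs that are too small for block_size-aligned payloads and must reject them, which is what the logged *ERROR* line records. The suite is a standalone binary; a hedged reproduction of the logged invocation (same inline-config assumption as the accel_perf sketches):

    # hedged sketch; the harness feeds the accel JSON config on fd 62
    ./test/accel/dif/dif -c /dev/fd/62 \
        62<<< '{"subsystems":[{"subsystem":"accel","config":[]}]}'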
00:07:00.081 passed 00:07:00.081 Test: generate copy: buffer alignment validate ...passed 00:07:00.081 00:07:00.081 Run Summary: Type Total Ran Passed Failed Inactive 00:07:00.081 suites 1 1 n/a 0 0 00:07:00.081 tests 20 20 20 0 0 00:07:00.081 asserts 204 204 204 0 n/a 00:07:00.081 00:07:00.081 Elapsed time = 0.002 seconds 00:07:00.340 00:07:00.340 real 0m0.392s 00:07:00.340 user 0m0.577s 00:07:00.340 sys 0m0.166s 00:07:00.340 21:29:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:00.340 21:29:38 -- common/autotest_common.sh@10 -- # set +x 00:07:00.340 ************************************ 00:07:00.340 END TEST accel_dif_functional_tests 00:07:00.340 ************************************ 00:07:00.340 00:07:00.340 real 0m56.866s 00:07:00.340 user 1m4.314s 00:07:00.340 sys 0m7.064s 00:07:00.340 21:29:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:00.340 21:29:39 -- common/autotest_common.sh@10 -- # set +x 00:07:00.340 ************************************ 00:07:00.340 END TEST accel 00:07:00.340 ************************************ 00:07:00.340 21:29:39 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:00.340 21:29:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:00.340 21:29:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:00.340 21:29:39 -- common/autotest_common.sh@10 -- # set +x 00:07:00.340 ************************************ 00:07:00.340 START TEST accel_rpc 00:07:00.340 ************************************ 00:07:00.340 21:29:39 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:00.599 * Looking for test storage... 00:07:00.599 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:00.599 21:29:39 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:00.599 21:29:39 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=3576107 00:07:00.599 21:29:39 -- accel/accel_rpc.sh@15 -- # waitforlisten 3576107 00:07:00.599 21:29:39 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:00.599 21:29:39 -- common/autotest_common.sh@819 -- # '[' -z 3576107 ']' 00:07:00.599 21:29:39 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:00.599 21:29:39 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:00.599 21:29:39 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:00.600 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:00.600 21:29:39 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:00.600 21:29:39 -- common/autotest_common.sh@10 -- # set +x 00:07:00.600 [2024-07-12 21:29:39.200313] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:00.600 [2024-07-12 21:29:39.200406] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3576107 ] 00:07:00.600 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.600 [2024-07-12 21:29:39.270495] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.600 [2024-07-12 21:29:39.340973] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:00.600 [2024-07-12 21:29:39.341103] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.537 21:29:40 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:01.537 21:29:40 -- common/autotest_common.sh@852 -- # return 0 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:01.537 21:29:40 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:01.537 21:29:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:01.537 21:29:40 -- common/autotest_common.sh@10 -- # set +x 00:07:01.537 ************************************ 00:07:01.537 START TEST accel_assign_opcode 00:07:01.537 ************************************ 00:07:01.537 21:29:40 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:01.537 21:29:40 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:01.537 21:29:40 -- common/autotest_common.sh@10 -- # set +x 00:07:01.537 [2024-07-12 21:29:40.019106] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:01.537 21:29:40 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:01.537 21:29:40 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:01.537 21:29:40 -- common/autotest_common.sh@10 -- # set +x 00:07:01.537 [2024-07-12 21:29:40.027114] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:01.537 21:29:40 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:01.537 21:29:40 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:01.537 21:29:40 -- common/autotest_common.sh@10 -- # set +x 00:07:01.537 21:29:40 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:01.537 21:29:40 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:01.537 21:29:40 -- common/autotest_common.sh@10 -- # set +x 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@42 -- # grep software 00:07:01.537 21:29:40 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:01.537 software 00:07:01.537 00:07:01.537 real 0m0.229s 00:07:01.537 user 0m0.040s 00:07:01.537 sys 0m0.017s 00:07:01.537 21:29:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:01.537 21:29:40 -- common/autotest_common.sh@10 -- # set +x 
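Outside the harness, the same opcode-reassignment flow can be driven with scripts/rpc.py, using only the RPCs that appear in the log above; the target must be started with --wait-for-rpc so the assignment lands before framework init:

    # hedged sketch of the logged RPC sequence
    ./build/bin/spdk_tgt --wait-for-rpc &
    ./scripts/rpc.py accel_assign_opc -o copy -m software
    ./scripts/rpc.py framework_start_init
    ./scripts/rpc.py accel_get_opc_assignments | jq -r .copy   # expect: software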
00:07:01.537 ************************************ 00:07:01.537 END TEST accel_assign_opcode 00:07:01.537 ************************************ 00:07:01.537 21:29:40 -- accel/accel_rpc.sh@55 -- # killprocess 3576107 00:07:01.537 21:29:40 -- common/autotest_common.sh@926 -- # '[' -z 3576107 ']' 00:07:01.537 21:29:40 -- common/autotest_common.sh@930 -- # kill -0 3576107 00:07:01.537 21:29:40 -- common/autotest_common.sh@931 -- # uname 00:07:01.537 21:29:40 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:01.537 21:29:40 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3576107 00:07:01.796 21:29:40 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:01.796 21:29:40 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:01.796 21:29:40 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3576107' 00:07:01.796 killing process with pid 3576107 00:07:01.796 21:29:40 -- common/autotest_common.sh@945 -- # kill 3576107 00:07:01.796 21:29:40 -- common/autotest_common.sh@950 -- # wait 3576107 00:07:02.055 00:07:02.055 real 0m1.564s 00:07:02.055 user 0m1.575s 00:07:02.055 sys 0m0.473s 00:07:02.055 21:29:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:02.055 21:29:40 -- common/autotest_common.sh@10 -- # set +x 00:07:02.055 ************************************ 00:07:02.055 END TEST accel_rpc 00:07:02.055 ************************************ 00:07:02.055 21:29:40 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:02.055 21:29:40 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:02.055 21:29:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:02.055 21:29:40 -- common/autotest_common.sh@10 -- # set +x 00:07:02.055 ************************************ 00:07:02.055 START TEST app_cmdline 00:07:02.055 ************************************ 00:07:02.055 21:29:40 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:02.055 * Looking for test storage... 00:07:02.055 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:02.055 21:29:40 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:02.055 21:29:40 -- app/cmdline.sh@17 -- # spdk_tgt_pid=3576467 00:07:02.055 21:29:40 -- app/cmdline.sh@18 -- # waitforlisten 3576467 00:07:02.055 21:29:40 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:02.055 21:29:40 -- common/autotest_common.sh@819 -- # '[' -z 3576467 ']' 00:07:02.055 21:29:40 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:02.055 21:29:40 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:02.055 21:29:40 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:02.055 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:02.055 21:29:40 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:02.055 21:29:40 -- common/autotest_common.sh@10 -- # set +x 00:07:02.055 [2024-07-12 21:29:40.812731] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
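The cmdline test that follows starts spdk_tgt with --rpcs-allowed spdk_get_version,rpc_get_methods, so any other method is rejected with the -32601 "Method not found" response reproduced below. A hedged sketch of the same probe:

    # hedged sketch; only the two allowed RPCs succeed against this target
    ./build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
    ./scripts/rpc.py spdk_get_version          # allowed, returns the version JSON
    ./scripts/rpc.py env_dpdk_get_mem_stats    # rejected with code -32601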
00:07:02.055 [2024-07-12 21:29:40.812802] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3576467 ] 00:07:02.314 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.314 [2024-07-12 21:29:40.877947] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.314 [2024-07-12 21:29:40.948858] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:02.314 [2024-07-12 21:29:40.948985] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.882 21:29:41 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:02.882 21:29:41 -- common/autotest_common.sh@852 -- # return 0 00:07:02.882 21:29:41 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:03.141 { 00:07:03.141 "version": "SPDK v24.01.1-pre git sha1 4b94202c6", 00:07:03.141 "fields": { 00:07:03.141 "major": 24, 00:07:03.141 "minor": 1, 00:07:03.141 "patch": 1, 00:07:03.141 "suffix": "-pre", 00:07:03.141 "commit": "4b94202c6" 00:07:03.141 } 00:07:03.141 } 00:07:03.141 21:29:41 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:03.141 21:29:41 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:03.141 21:29:41 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:03.141 21:29:41 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:03.141 21:29:41 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:03.141 21:29:41 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:03.141 21:29:41 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:03.141 21:29:41 -- app/cmdline.sh@26 -- # sort 00:07:03.141 21:29:41 -- common/autotest_common.sh@10 -- # set +x 00:07:03.141 21:29:41 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:03.141 21:29:41 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:03.141 21:29:41 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:03.141 21:29:41 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:03.141 21:29:41 -- common/autotest_common.sh@640 -- # local es=0 00:07:03.141 21:29:41 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:03.141 21:29:41 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:03.141 21:29:41 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:03.141 21:29:41 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:03.141 21:29:41 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:03.141 21:29:41 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:03.141 21:29:41 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:03.141 21:29:41 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:03.141 21:29:41 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:03.141 21:29:41 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:03.400 request: 00:07:03.400 { 00:07:03.400 "method": "env_dpdk_get_mem_stats", 00:07:03.400 "req_id": 1 00:07:03.400 } 00:07:03.400 Got JSON-RPC error response 00:07:03.400 response: 00:07:03.400 { 00:07:03.400 "code": -32601, 00:07:03.400 "message": "Method not found" 00:07:03.400 } 00:07:03.400 21:29:41 -- common/autotest_common.sh@643 -- # es=1 00:07:03.400 21:29:41 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:03.400 21:29:41 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:03.400 21:29:41 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:03.400 21:29:41 -- app/cmdline.sh@1 -- # killprocess 3576467 00:07:03.400 21:29:41 -- common/autotest_common.sh@926 -- # '[' -z 3576467 ']' 00:07:03.400 21:29:41 -- common/autotest_common.sh@930 -- # kill -0 3576467 00:07:03.400 21:29:41 -- common/autotest_common.sh@931 -- # uname 00:07:03.400 21:29:41 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:03.400 21:29:41 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 3576467 00:07:03.400 21:29:42 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:03.400 21:29:42 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:03.400 21:29:42 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 3576467' 00:07:03.400 killing process with pid 3576467 00:07:03.400 21:29:42 -- common/autotest_common.sh@945 -- # kill 3576467 00:07:03.400 21:29:42 -- common/autotest_common.sh@950 -- # wait 3576467 00:07:03.660 00:07:03.660 real 0m1.635s 00:07:03.660 user 0m1.867s 00:07:03.660 sys 0m0.485s 00:07:03.660 21:29:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:03.660 21:29:42 -- common/autotest_common.sh@10 -- # set +x 00:07:03.660 ************************************ 00:07:03.660 END TEST app_cmdline 00:07:03.660 ************************************ 00:07:03.660 21:29:42 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:03.660 21:29:42 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:03.660 21:29:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:03.660 21:29:42 -- common/autotest_common.sh@10 -- # set +x 00:07:03.660 ************************************ 00:07:03.660 START TEST version 00:07:03.660 ************************************ 00:07:03.660 21:29:42 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:03.919 * Looking for test storage... 
00:07:03.919 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:03.919 21:29:42 -- app/version.sh@17 -- # get_header_version major 00:07:03.919 21:29:42 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:03.919 21:29:42 -- app/version.sh@14 -- # cut -f2 00:07:03.919 21:29:42 -- app/version.sh@14 -- # tr -d '"' 00:07:03.919 21:29:42 -- app/version.sh@17 -- # major=24 00:07:03.919 21:29:42 -- app/version.sh@18 -- # get_header_version minor 00:07:03.919 21:29:42 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:03.919 21:29:42 -- app/version.sh@14 -- # cut -f2 00:07:03.919 21:29:42 -- app/version.sh@14 -- # tr -d '"' 00:07:03.919 21:29:42 -- app/version.sh@18 -- # minor=1 00:07:03.919 21:29:42 -- app/version.sh@19 -- # get_header_version patch 00:07:03.919 21:29:42 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:03.919 21:29:42 -- app/version.sh@14 -- # cut -f2 00:07:03.919 21:29:42 -- app/version.sh@14 -- # tr -d '"' 00:07:03.919 21:29:42 -- app/version.sh@19 -- # patch=1 00:07:03.919 21:29:42 -- app/version.sh@20 -- # get_header_version suffix 00:07:03.919 21:29:42 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:03.919 21:29:42 -- app/version.sh@14 -- # cut -f2 00:07:03.919 21:29:42 -- app/version.sh@14 -- # tr -d '"' 00:07:03.919 21:29:42 -- app/version.sh@20 -- # suffix=-pre 00:07:03.919 21:29:42 -- app/version.sh@22 -- # version=24.1 00:07:03.919 21:29:42 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:03.919 21:29:42 -- app/version.sh@25 -- # version=24.1.1 00:07:03.919 21:29:42 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:03.919 21:29:42 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:03.919 21:29:42 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:03.919 21:29:42 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:03.919 21:29:42 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:03.919 00:07:03.919 real 0m0.163s 00:07:03.919 user 0m0.090s 00:07:03.919 sys 0m0.118s 00:07:03.919 21:29:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:03.919 21:29:42 -- common/autotest_common.sh@10 -- # set +x 00:07:03.919 ************************************ 00:07:03.919 END TEST version 00:07:03.919 ************************************ 00:07:03.919 21:29:42 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:07:03.919 21:29:42 -- spdk/autotest.sh@204 -- # uname -s 00:07:03.919 21:29:42 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:07:03.919 21:29:42 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:03.919 21:29:42 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:03.919 21:29:42 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:07:03.919 21:29:42 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:07:03.919 21:29:42 -- spdk/autotest.sh@268 -- # timing_exit lib 
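Each get_header_version call above reduces to a three-stage pipeline over include/spdk/version.h; for example, the major version is extracted with:

    # pipeline copied from the logged version.sh steps
    grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' include/spdk/version.h | cut -f2 | tr -d '"'

cut -f2 relies on the header separating the macro name from its value with a tab (cut's default delimiter), and tr strips the quotes from string-valued macros such as SPDK_VERSION_SUFFIX.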
00:07:03.919 21:29:42 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:03.919 21:29:42 -- common/autotest_common.sh@10 -- # set +x 00:07:03.919 21:29:42 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:07:03.919 21:29:42 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:07:03.919 21:29:42 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:07:03.919 21:29:42 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:03.919 21:29:42 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:03.919 21:29:42 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:03.920 21:29:42 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:03.920 21:29:42 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:03.920 21:29:42 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:03.920 21:29:42 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:03.920 21:29:42 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:03.920 21:29:42 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:03.920 21:29:42 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:03.920 21:29:42 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:03.920 21:29:42 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:03.920 21:29:42 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:03.920 21:29:42 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:03.920 21:29:42 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:03.920 21:29:42 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:03.920 21:29:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:03.920 21:29:42 -- common/autotest_common.sh@10 -- # set +x 00:07:03.920 ************************************ 00:07:03.920 START TEST llvm_fuzz 00:07:03.920 ************************************ 00:07:03.920 21:29:42 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:04.181 * Looking for test storage... 
00:07:04.181 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:04.181 21:29:42 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:04.181 21:29:42 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:04.181 21:29:42 -- common/autotest_common.sh@538 -- # fuzzers=() 00:07:04.181 21:29:42 -- common/autotest_common.sh@538 -- # local fuzzers 00:07:04.181 21:29:42 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:07:04.181 21:29:42 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:04.181 21:29:42 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:04.181 21:29:42 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:04.181 21:29:42 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:04.181 21:29:42 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:04.181 21:29:42 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:07:04.181 21:29:42 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:04.181 21:29:42 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:04.181 21:29:42 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:04.181 21:29:42 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:04.181 21:29:42 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:04.181 21:29:42 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:04.181 21:29:42 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:04.181 21:29:42 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:04.181 21:29:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:04.181 21:29:42 -- common/autotest_common.sh@10 -- # set +x 00:07:04.181 ************************************ 00:07:04.181 START TEST nvmf_fuzz 00:07:04.181 ************************************ 00:07:04.181 21:29:42 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:04.181 * Looking for test storage... 
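Note on the get_fuzzer_targets trace above: with no explicit target list set, llvm.sh globs test/fuzz/llvm/, strips the entries to basenames (yielding "common.sh llvm-gcov.sh nvmf vfio"), and its per-target case statement skips the two helper scripts so that only nvmf and vfio actually run. A condensed sketch of that discovery loop; the exact case arms are guessed from the traced skips:

# Sketch of the target discovery traced above; rootdir as in this workspace.
rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
fuzzers=("$rootdir/test/fuzz/llvm/"*)   # glob everything in the llvm fuzz dir
fuzzers=("${fuzzers[@]##*/}")           # keep only the basenames
for fuzzer in "${fuzzers[@]}"; do
    case "$fuzzer" in
        common.sh | llvm-gcov.sh) ;;    # helper scripts, skipped in the trace
        *) echo "would run_test ${fuzzer}_fuzz $rootdir/test/fuzz/llvm/$fuzzer/run.sh" ;;
    esac
done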
00:07:04.181 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:04.181 21:29:42 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:04.181 21:29:42 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:04.181 21:29:42 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:04.181 21:29:42 -- common/autotest_common.sh@34 -- # set -e 00:07:04.181 21:29:42 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:04.182 21:29:42 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:04.182 21:29:42 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:04.182 21:29:42 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:04.182 21:29:42 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:04.182 21:29:42 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:04.182 21:29:42 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:04.182 21:29:42 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:04.182 21:29:42 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:04.182 21:29:42 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:04.182 21:29:42 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:04.182 21:29:42 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:04.182 21:29:42 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:04.182 21:29:42 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:04.182 21:29:42 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:04.182 21:29:42 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:04.182 21:29:42 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:04.182 21:29:42 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:04.182 21:29:42 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:04.182 21:29:42 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:04.182 21:29:42 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:04.182 21:29:42 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:04.182 21:29:42 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:04.182 21:29:42 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:04.182 21:29:42 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:04.182 21:29:42 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:04.182 21:29:42 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:04.182 21:29:42 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:04.182 21:29:42 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:04.182 21:29:42 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:04.182 21:29:42 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:04.182 21:29:42 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:04.182 21:29:42 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:04.182 21:29:42 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:04.182 21:29:42 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:04.182 21:29:42 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:04.182 21:29:42 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:04.182 21:29:42 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:04.182 21:29:42 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:04.182 21:29:42 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:04.182 21:29:42 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:04.182 21:29:42 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:04.182 21:29:42 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:04.182 21:29:42 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:04.182 21:29:42 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:07:04.182 21:29:42 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:04.182 21:29:42 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:04.182 21:29:42 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:04.182 21:29:42 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:04.182 21:29:42 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:04.182 21:29:42 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:04.182 21:29:42 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:04.182 21:29:42 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:04.182 21:29:42 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:04.182 21:29:42 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:04.182 21:29:42 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:04.182 21:29:42 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:04.182 21:29:42 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:04.182 21:29:42 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:04.182 21:29:42 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:04.182 21:29:42 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:04.182 21:29:42 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:04.182 21:29:42 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:04.182 21:29:42 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:04.182 21:29:42 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:07:04.182 21:29:42 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:04.182 21:29:42 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:04.182 21:29:42 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:04.182 21:29:42 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:04.182 21:29:42 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:04.182 21:29:42 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:04.182 21:29:42 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:04.182 21:29:42 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:04.182 21:29:42 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:04.182 21:29:42 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:04.182 21:29:42 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:04.182 21:29:42 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:04.182 21:29:42 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:04.182 21:29:42 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:04.182 21:29:42 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:04.182 21:29:42 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:04.182 21:29:42 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:04.182 21:29:42 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:04.182 21:29:42 -- common/autotest_common.sh@48 -- # source 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:04.182 21:29:42 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:04.182 21:29:42 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:04.182 21:29:42 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:04.182 21:29:42 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:04.182 21:29:42 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:04.182 21:29:42 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:04.182 21:29:42 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:04.182 21:29:42 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:04.182 21:29:42 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:04.182 21:29:42 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:04.182 21:29:42 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:04.182 21:29:42 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:04.182 21:29:42 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:04.182 21:29:42 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:04.182 21:29:42 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:04.182 #define SPDK_CONFIG_H 00:07:04.182 #define SPDK_CONFIG_APPS 1 00:07:04.182 #define SPDK_CONFIG_ARCH native 00:07:04.182 #undef SPDK_CONFIG_ASAN 00:07:04.182 #undef SPDK_CONFIG_AVAHI 00:07:04.182 #undef SPDK_CONFIG_CET 00:07:04.182 #define SPDK_CONFIG_COVERAGE 1 00:07:04.182 #define SPDK_CONFIG_CROSS_PREFIX 00:07:04.182 #undef SPDK_CONFIG_CRYPTO 00:07:04.182 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:04.182 #undef SPDK_CONFIG_CUSTOMOCF 00:07:04.182 #undef SPDK_CONFIG_DAOS 00:07:04.182 #define SPDK_CONFIG_DAOS_DIR 00:07:04.182 #define SPDK_CONFIG_DEBUG 1 00:07:04.182 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:04.182 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:04.182 #define SPDK_CONFIG_DPDK_INC_DIR 00:07:04.182 #define SPDK_CONFIG_DPDK_LIB_DIR 00:07:04.182 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:04.182 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:04.182 #define SPDK_CONFIG_EXAMPLES 1 00:07:04.182 #undef SPDK_CONFIG_FC 00:07:04.182 #define SPDK_CONFIG_FC_PATH 00:07:04.182 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:04.182 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:04.182 #undef SPDK_CONFIG_FUSE 00:07:04.182 #define SPDK_CONFIG_FUZZER 1 00:07:04.182 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:04.182 #undef SPDK_CONFIG_GOLANG 00:07:04.182 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:04.182 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:04.182 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:04.182 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:04.182 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:04.182 #define SPDK_CONFIG_IDXD 1 00:07:04.182 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:04.182 #undef SPDK_CONFIG_IPSEC_MB 
00:07:04.182 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:04.182 #define SPDK_CONFIG_ISAL 1 00:07:04.182 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:04.182 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:04.182 #define SPDK_CONFIG_LIBDIR 00:07:04.182 #undef SPDK_CONFIG_LTO 00:07:04.182 #define SPDK_CONFIG_MAX_LCORES 00:07:04.182 #define SPDK_CONFIG_NVME_CUSE 1 00:07:04.182 #undef SPDK_CONFIG_OCF 00:07:04.182 #define SPDK_CONFIG_OCF_PATH 00:07:04.182 #define SPDK_CONFIG_OPENSSL_PATH 00:07:04.182 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:04.182 #undef SPDK_CONFIG_PGO_USE 00:07:04.182 #define SPDK_CONFIG_PREFIX /usr/local 00:07:04.182 #undef SPDK_CONFIG_RAID5F 00:07:04.182 #undef SPDK_CONFIG_RBD 00:07:04.182 #define SPDK_CONFIG_RDMA 1 00:07:04.182 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:04.182 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:04.182 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:04.182 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:04.182 #undef SPDK_CONFIG_SHARED 00:07:04.182 #undef SPDK_CONFIG_SMA 00:07:04.182 #define SPDK_CONFIG_TESTS 1 00:07:04.182 #undef SPDK_CONFIG_TSAN 00:07:04.182 #define SPDK_CONFIG_UBLK 1 00:07:04.182 #define SPDK_CONFIG_UBSAN 1 00:07:04.182 #undef SPDK_CONFIG_UNIT_TESTS 00:07:04.182 #undef SPDK_CONFIG_URING 00:07:04.182 #define SPDK_CONFIG_URING_PATH 00:07:04.182 #undef SPDK_CONFIG_URING_ZNS 00:07:04.182 #undef SPDK_CONFIG_USDT 00:07:04.182 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:04.182 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:04.182 #define SPDK_CONFIG_VFIO_USER 1 00:07:04.183 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:04.183 #define SPDK_CONFIG_VHOST 1 00:07:04.183 #define SPDK_CONFIG_VIRTIO 1 00:07:04.183 #undef SPDK_CONFIG_VTUNE 00:07:04.183 #define SPDK_CONFIG_VTUNE_DIR 00:07:04.183 #define SPDK_CONFIG_WERROR 1 00:07:04.183 #define SPDK_CONFIG_WPDK_DIR 00:07:04.183 #undef SPDK_CONFIG_XNVME 00:07:04.183 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:04.183 21:29:42 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:04.183 21:29:42 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:04.183 21:29:42 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:04.183 21:29:42 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:04.183 21:29:42 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:04.183 21:29:42 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.183 21:29:42 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.183 21:29:42 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.183 21:29:42 -- paths/export.sh@5 -- # export PATH 00:07:04.183 21:29:42 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:04.183 21:29:42 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:04.183 21:29:42 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:04.183 21:29:42 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:04.183 21:29:42 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:04.183 21:29:42 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:04.183 21:29:42 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:04.183 21:29:42 -- pm/common@16 -- # TEST_TAG=N/A 00:07:04.183 21:29:42 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:04.183 21:29:42 -- common/autotest_common.sh@52 -- # : 1 00:07:04.183 21:29:42 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:04.183 21:29:42 -- common/autotest_common.sh@56 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:04.183 21:29:42 -- common/autotest_common.sh@58 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:04.183 21:29:42 -- common/autotest_common.sh@60 -- # : 1 00:07:04.183 21:29:42 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:04.183 21:29:42 -- common/autotest_common.sh@62 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:04.183 21:29:42 -- common/autotest_common.sh@64 -- # : 00:07:04.183 21:29:42 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:04.183 21:29:42 -- common/autotest_common.sh@66 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:04.183 21:29:42 -- common/autotest_common.sh@68 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:04.183 21:29:42 -- common/autotest_common.sh@70 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:04.183 21:29:42 -- common/autotest_common.sh@72 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:04.183 21:29:42 -- common/autotest_common.sh@74 -- # : 0 00:07:04.183 
21:29:42 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:04.183 21:29:42 -- common/autotest_common.sh@76 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:04.183 21:29:42 -- common/autotest_common.sh@78 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:04.183 21:29:42 -- common/autotest_common.sh@80 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:04.183 21:29:42 -- common/autotest_common.sh@82 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:04.183 21:29:42 -- common/autotest_common.sh@84 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:04.183 21:29:42 -- common/autotest_common.sh@86 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:04.183 21:29:42 -- common/autotest_common.sh@88 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:04.183 21:29:42 -- common/autotest_common.sh@90 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:04.183 21:29:42 -- common/autotest_common.sh@92 -- # : 1 00:07:04.183 21:29:42 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:04.183 21:29:42 -- common/autotest_common.sh@94 -- # : 1 00:07:04.183 21:29:42 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:04.183 21:29:42 -- common/autotest_common.sh@96 -- # : rdma 00:07:04.183 21:29:42 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:04.183 21:29:42 -- common/autotest_common.sh@98 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:04.183 21:29:42 -- common/autotest_common.sh@100 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:04.183 21:29:42 -- common/autotest_common.sh@102 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:04.183 21:29:42 -- common/autotest_common.sh@104 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:04.183 21:29:42 -- common/autotest_common.sh@106 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:04.183 21:29:42 -- common/autotest_common.sh@108 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:07:04.183 21:29:42 -- common/autotest_common.sh@110 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:04.183 21:29:42 -- common/autotest_common.sh@112 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:04.183 21:29:42 -- common/autotest_common.sh@114 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:04.183 21:29:42 -- common/autotest_common.sh@116 -- # : 1 00:07:04.183 21:29:42 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:04.183 21:29:42 -- common/autotest_common.sh@118 -- # : 00:07:04.183 21:29:42 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:04.183 21:29:42 -- common/autotest_common.sh@120 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:04.183 21:29:42 -- common/autotest_common.sh@122 -- # : 0 
00:07:04.183 21:29:42 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:04.183 21:29:42 -- common/autotest_common.sh@124 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:04.183 21:29:42 -- common/autotest_common.sh@126 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:04.183 21:29:42 -- common/autotest_common.sh@128 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:04.183 21:29:42 -- common/autotest_common.sh@130 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:04.183 21:29:42 -- common/autotest_common.sh@132 -- # : 00:07:04.183 21:29:42 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:04.183 21:29:42 -- common/autotest_common.sh@134 -- # : true 00:07:04.183 21:29:42 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:04.183 21:29:42 -- common/autotest_common.sh@136 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:04.183 21:29:42 -- common/autotest_common.sh@138 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:04.183 21:29:42 -- common/autotest_common.sh@140 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:04.183 21:29:42 -- common/autotest_common.sh@142 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:04.183 21:29:42 -- common/autotest_common.sh@144 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:04.183 21:29:42 -- common/autotest_common.sh@146 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:04.183 21:29:42 -- common/autotest_common.sh@148 -- # : 00:07:04.183 21:29:42 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:04.183 21:29:42 -- common/autotest_common.sh@150 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:04.183 21:29:42 -- common/autotest_common.sh@152 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:04.183 21:29:42 -- common/autotest_common.sh@154 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:04.183 21:29:42 -- common/autotest_common.sh@156 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:04.183 21:29:42 -- common/autotest_common.sh@158 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:04.183 21:29:42 -- common/autotest_common.sh@160 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:04.183 21:29:42 -- common/autotest_common.sh@163 -- # : 00:07:04.183 21:29:42 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:04.183 21:29:42 -- common/autotest_common.sh@165 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:04.183 21:29:42 -- common/autotest_common.sh@167 -- # : 0 00:07:04.183 21:29:42 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:04.183 21:29:42 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:04.183 21:29:42 -- 
common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:04.184 21:29:42 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:04.184 21:29:42 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:04.184 21:29:42 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:04.184 21:29:42 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:04.184 21:29:42 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:04.184 21:29:42 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:04.184 21:29:42 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:04.184 21:29:42 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:04.184 21:29:42 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:04.184 21:29:42 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:04.184 21:29:42 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:04.184 21:29:42 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:04.184 21:29:42 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:04.184 21:29:42 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:04.184 21:29:42 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:04.184 21:29:42 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:04.184 21:29:42 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:04.184 21:29:42 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:04.184 21:29:42 -- common/autotest_common.sh@196 -- # cat 00:07:04.184 21:29:42 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:04.184 21:29:42 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:04.184 21:29:42 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:04.184 21:29:42 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:04.184 21:29:42 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:04.184 21:29:42 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:04.184 21:29:42 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:04.184 21:29:42 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:04.184 21:29:42 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:04.184 21:29:42 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:04.184 21:29:42 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:04.184 21:29:42 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:04.184 21:29:42 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:04.184 21:29:42 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:04.184 21:29:42 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:04.184 21:29:42 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:04.184 21:29:42 -- common/autotest_common.sh@242 -- # 
AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:04.184 21:29:42 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:04.184 21:29:42 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:04.184 21:29:42 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:07:04.184 21:29:42 -- common/autotest_common.sh@249 -- # export valgrind= 00:07:04.184 21:29:42 -- common/autotest_common.sh@249 -- # valgrind= 00:07:04.184 21:29:42 -- common/autotest_common.sh@255 -- # uname -s 00:07:04.184 21:29:42 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:07:04.184 21:29:42 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:07:04.184 21:29:42 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:07:04.184 21:29:42 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:07:04.184 21:29:42 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:04.184 21:29:42 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:04.184 21:29:42 -- common/autotest_common.sh@265 -- # MAKE=make 00:07:04.184 21:29:42 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:07:04.184 21:29:42 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:07:04.184 21:29:42 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:07:04.184 21:29:42 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:04.184 21:29:42 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:07:04.184 21:29:42 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:07:04.184 21:29:42 -- common/autotest_common.sh@309 -- # [[ -z 3576893 ]] 00:07:04.184 21:29:42 -- common/autotest_common.sh@309 -- # kill -0 3576893 00:07:04.184 21:29:42 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:07:04.184 21:29:42 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:07:04.184 21:29:42 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:07:04.184 21:29:42 -- common/autotest_common.sh@322 -- # local mount target_dir 00:07:04.184 21:29:42 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:07:04.184 21:29:42 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:07:04.184 21:29:42 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:07:04.184 21:29:42 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:07:04.184 21:29:42 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.ql7VKh 00:07:04.184 21:29:42 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:04.184 21:29:42 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:07:04.184 21:29:42 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:07:04.184 21:29:42 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.ql7VKh/tests/nvmf /tmp/spdk.ql7VKh 00:07:04.184 21:29:42 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:07:04.184 21:29:42 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:04.184 21:29:42 -- common/autotest_common.sh@318 -- # df -T 00:07:04.184 21:29:42 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 
00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:07:04.184 21:29:42 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:07:04.184 21:29:42 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:07:04.184 21:29:42 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:07:04.184 21:29:42 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # avails["$mount"]=54770364416 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:07:04.184 21:29:42 -- common/autotest_common.sh@354 -- # uses["$mount"]=6971953152 00:07:04.184 21:29:42 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:07:04.184 21:29:42 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:07:04.184 21:29:42 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342484992 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:07:04.184 21:29:42 -- common/autotest_common.sh@354 -- # uses["$mount"]=5980160 00:07:04.184 21:29:42 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870765568 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:07:04.184 21:29:42 -- common/autotest_common.sh@354 -- # uses["$mount"]=393216 00:07:04.184 21:29:42 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:04.184 21:29:42 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:07:04.184 21:29:42 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:07:04.184 21:29:42 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:07:04.185 21:29:42 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:04.185 21:29:42 -- common/autotest_common.sh@357 
-- # printf '* Looking for test storage...\n' 00:07:04.185 * Looking for test storage... 00:07:04.185 21:29:42 -- common/autotest_common.sh@359 -- # local target_space new_size 00:07:04.185 21:29:42 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:07:04.185 21:29:42 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:04.185 21:29:42 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:04.185 21:29:42 -- common/autotest_common.sh@363 -- # mount=/ 00:07:04.185 21:29:42 -- common/autotest_common.sh@365 -- # target_space=54770364416 00:07:04.185 21:29:42 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:07:04.185 21:29:42 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:07:04.185 21:29:42 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:07:04.185 21:29:42 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:07:04.185 21:29:42 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:07:04.185 21:29:42 -- common/autotest_common.sh@372 -- # new_size=9186545664 00:07:04.185 21:29:42 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:04.185 21:29:42 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:04.185 21:29:42 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:04.185 21:29:42 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:04.185 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:04.185 21:29:42 -- common/autotest_common.sh@380 -- # return 0 00:07:04.185 21:29:42 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:07:04.185 21:29:42 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:07:04.185 21:29:42 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:04.185 21:29:42 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:04.185 21:29:42 -- common/autotest_common.sh@1672 -- # true 00:07:04.185 21:29:42 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:07:04.185 21:29:42 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:04.185 21:29:42 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:04.185 21:29:42 -- common/autotest_common.sh@27 -- # exec 00:07:04.185 21:29:42 -- common/autotest_common.sh@29 -- # exec 00:07:04.185 21:29:42 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:04.185 21:29:42 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:04.185 21:29:42 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:04.185 21:29:42 -- common/autotest_common.sh@18 -- # set -x 00:07:04.185 21:29:42 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:04.185 21:29:42 -- ../common.sh@8 -- # pids=() 00:07:04.185 21:29:42 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:04.185 21:29:42 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:04.185 21:29:42 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:04.185 21:29:42 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:04.185 21:29:42 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:04.185 21:29:42 -- nvmf/run.sh@61 -- # mem_size=512 00:07:04.185 21:29:42 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:04.185 21:29:42 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:04.185 21:29:42 -- ../common.sh@69 -- # local fuzz_num=25 00:07:04.185 21:29:42 -- ../common.sh@70 -- # local time=1 00:07:04.185 21:29:42 -- ../common.sh@72 -- # (( i = 0 )) 00:07:04.185 21:29:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:04.185 21:29:42 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:04.185 21:29:42 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:04.185 21:29:42 -- nvmf/run.sh@24 -- # local timen=1 00:07:04.185 21:29:42 -- nvmf/run.sh@25 -- # local core=0x1 00:07:04.185 21:29:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:04.185 21:29:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:04.185 21:29:42 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:04.185 21:29:42 -- nvmf/run.sh@29 -- # port=4400 00:07:04.185 21:29:42 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:04.444 21:29:42 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:04.444 21:29:42 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:04.444 21:29:42 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:04.444 [2024-07-12 21:29:42.998270] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
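Note on the run.sh setup traced above: the harness derives its own parameters before launching. The number of fuzzer entry points comes from counting '.fn =' initializers in llvm_nvme_fuzz.c (25 in this run), fuzzer index 0 maps to TCP port 4400 via printf %02d, and the stock fuzz_json.conf is rewritten from trsvcid 4420 to the derived port. A sketch of that setup; redirecting the sed output into /tmp/fuzz_json_0.conf is an assumption, since the trace shows only the sed command and the resulting nvmf_cfg path:

# Sketch of the per-fuzzer setup traced above (index 0 of 25).
rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
fuzzfile=$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c
fuzz_num=$(grep -c '\.fn =' "$fuzzfile")   # 25 in this run
i=0
port=44$(printf '%02d' "$i")               # 4400 for fuzzer 0
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
# Assumption: the rewritten JSON config is what lands in /tmp/fuzz_json_0.conf.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_${i}.conf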
00:07:04.444 [2024-07-12 21:29:42.998344] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3576931 ] 00:07:04.444 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.703 [2024-07-12 21:29:43.255364] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.703 [2024-07-12 21:29:43.339686] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:04.703 [2024-07-12 21:29:43.339828] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.703 [2024-07-12 21:29:43.398146] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:04.703 [2024-07-12 21:29:43.414420] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:04.703 INFO: Running with entropic power schedule (0xFF, 100). 00:07:04.703 INFO: Seed: 1754238334 00:07:04.703 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:04.703 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:04.703 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:04.703 INFO: A corpus is not provided, starting from an empty corpus 00:07:04.703 #2 INITED exec/s: 0 rss: 60Mb 00:07:04.703 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:04.703 This may also happen if the target rejected all inputs we tried so far 00:07:04.703 [2024-07-12 21:29:43.481592] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b8b8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb8b8b8b8b8b8b8b8 00:07:04.703 [2024-07-12 21:29:43.481631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.218 NEW_FUNC[1/670]: 0x480d10 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:05.218 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:05.218 #28 NEW cov: 11472 ft: 11473 corp: 2/67b lim: 320 exec/s: 0 rss: 67Mb L: 66/66 MS: 1 InsertRepeatedBytes- 00:07:05.218 [2024-07-12 21:29:43.821861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:05.218 [2024-07-12 21:29:43.821909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.218 NEW_FUNC[1/1]: 0x12e0670 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2014 00:07:05.218 #31 NEW cov: 11628 ft: 12269 corp: 3/155b lim: 320 exec/s: 0 rss: 67Mb L: 88/88 MS: 3 InsertRepeatedBytes-CopyPart-CopyPart- 00:07:05.218 [2024-07-12 21:29:43.871950] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:05.218 [2024-07-12 21:29:43.871981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.218 #32 NEW cov: 11634 ft: 12390 corp: 4/272b lim: 320 exec/s: 0 rss: 67Mb L: 
117/117 MS: 1 InsertRepeatedBytes- 00:07:05.218 [2024-07-12 21:29:43.922043] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b8b8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb8b8b8b8b8b8b8b8 00:07:05.219 [2024-07-12 21:29:43.922070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.219 #43 NEW cov: 11719 ft: 12625 corp: 5/338b lim: 320 exec/s: 0 rss: 67Mb L: 66/117 MS: 1 ShuffleBytes- 00:07:05.219 [2024-07-12 21:29:43.972154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x2303030303030303 00:07:05.219 [2024-07-12 21:29:43.972183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.219 #44 NEW cov: 11719 ft: 12694 corp: 6/427b lim: 320 exec/s: 0 rss: 67Mb L: 89/117 MS: 1 InsertByte- 00:07:05.477 [2024-07-12 21:29:44.022367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:05.477 [2024-07-12 21:29:44.022395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.477 #45 NEW cov: 11719 ft: 12751 corp: 7/515b lim: 320 exec/s: 0 rss: 67Mb L: 88/117 MS: 1 CMP- DE: "\036\000\000\000"- 00:07:05.477 [2024-07-12 21:29:44.072687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:04040404 cdw11:04040404 SGL TRANSPORT DATA BLOCK TRANSPORT 0x404040404040303 00:07:05.477 [2024-07-12 21:29:44.072714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.477 [2024-07-12 21:29:44.072852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:4040404 cdw10:03030303 cdw11:03030303 00:07:05.477 [2024-07-12 21:29:44.072870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.477 #46 NEW cov: 11730 ft: 12944 corp: 8/652b lim: 320 exec/s: 0 rss: 67Mb L: 137/137 MS: 1 InsertRepeatedBytes- 00:07:05.477 [2024-07-12 21:29:44.122675] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b8b8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb8b8b8b8b8b8b8b8 00:07:05.477 [2024-07-12 21:29:44.122703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.477 #47 NEW cov: 11730 ft: 13028 corp: 9/718b lim: 320 exec/s: 0 rss: 68Mb L: 66/137 MS: 1 CrossOver- 00:07:05.477 [2024-07-12 21:29:44.172837] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b8b8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb8b8b8b8b8b8b8b8 00:07:05.477 [2024-07-12 21:29:44.172867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.477 #48 NEW cov: 11730 ft: 13169 corp: 10/799b lim: 320 exec/s: 0 rss: 68Mb L: 81/137 MS: 1 InsertRepeatedBytes- 00:07:05.477 [2024-07-12 21:29:44.222871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x303030303030303 00:07:05.477 [2024-07-12 21:29:44.222896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.477 #49 NEW cov: 11730 ft: 13214 corp: 11/886b lim: 320 exec/s: 0 rss: 68Mb L: 87/137 MS: 1 EraseBytes- 00:07:05.736 [2024-07-12 21:29:44.273158] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffff31ffffff 00:07:05.736 [2024-07-12 21:29:44.273188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.736 #50 NEW cov: 11730 ft: 13239 corp: 12/1003b lim: 320 exec/s: 0 rss: 68Mb L: 117/137 MS: 1 ChangeByte- 00:07:05.736 [2024-07-12 21:29:44.323296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:05.736 [2024-07-12 21:29:44.323323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.736 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:05.736 #51 NEW cov: 11753 ft: 13287 corp: 13/1090b lim: 320 exec/s: 0 rss: 68Mb L: 87/137 MS: 1 ChangeBinInt- 00:07:05.736 [2024-07-12 21:29:44.373481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0xa03030303030303 00:07:05.736 [2024-07-12 21:29:44.373511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.736 #52 NEW cov: 11753 ft: 13318 corp: 14/1178b lim: 320 exec/s: 0 rss: 68Mb L: 88/137 MS: 1 CopyPart- 00:07:05.736 [2024-07-12 21:29:44.433606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:fa020303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0xa03030303030303 00:07:05.736 [2024-07-12 21:29:44.433635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.736 #53 NEW cov: 11753 ft: 13335 corp: 15/1266b lim: 320 exec/s: 53 rss: 68Mb L: 88/137 MS: 1 ChangeBinInt- 00:07:05.736 [2024-07-12 21:29:44.483696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:033b0303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x2303030303030303 00:07:05.736 [2024-07-12 21:29:44.483733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.736 #54 NEW cov: 11753 ft: 13355 corp: 16/1355b lim: 320 exec/s: 54 rss: 68Mb L: 89/137 MS: 1 ChangeByte- 00:07:05.994 [2024-07-12 21:29:44.534014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:05.994 [2024-07-12 21:29:44.534047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.994 #55 NEW cov: 11753 ft: 13368 corp: 17/1443b lim: 320 exec/s: 55 rss: 69Mb L: 88/137 MS: 1 ChangeByte- 00:07:05.994 [2024-07-12 21:29:44.584036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:05.994 [2024-07-12 21:29:44.584063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.994 #56 NEW cov: 11753 ft: 13465 corp: 18/1531b lim: 320 exec/s: 56 rss: 69Mb L: 88/137 MS: 1 ChangeBit- 00:07:05.994 [2024-07-12 21:29:44.634226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:033b0303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x2303030303030303 00:07:05.994 [2024-07-12 21:29:44.634255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.994 #57 NEW cov: 11753 ft: 13491 corp: 19/1620b lim: 320 exec/s: 57 rss: 69Mb L: 89/137 MS: 1 ChangeBinInt- 00:07:05.994 [2024-07-12 21:29:44.684358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:05.994 [2024-07-12 21:29:44.684384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.994 #58 NEW cov: 11753 ft: 13516 corp: 20/1708b lim: 320 exec/s: 58 rss: 69Mb L: 88/137 MS: 1 ShuffleBytes- 00:07:05.994 [2024-07-12 21:29:44.734695] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b8b8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb8b8b8b8b8b8b8b8 00:07:05.994 [2024-07-12 21:29:44.734721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.994 #59 NEW cov: 11753 ft: 13570 corp: 21/1774b lim: 320 exec/s: 59 rss: 69Mb L: 66/137 MS: 1 CMP- DE: "\001\000"- 00:07:06.253 [2024-07-12 21:29:44.784725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3030303 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.253 [2024-07-12 21:29:44.784755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.253 #62 NEW cov: 11753 ft: 13580 corp: 22/1876b lim: 320 exec/s: 62 rss: 69Mb L: 102/137 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:07:06.253 [2024-07-12 21:29:44.845113] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b8b8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb8b8b8b8b8b8b8b8 00:07:06.253 [2024-07-12 21:29:44.845142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.253 #63 NEW cov: 11753 ft: 13587 corp: 23/1942b lim: 320 exec/s: 63 rss: 69Mb L: 66/137 MS: 1 ChangeBit- 00:07:06.253 [2024-07-12 21:29:44.895072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:fa020303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:06.253 [2024-07-12 21:29:44.895101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.253 #64 NEW cov: 11753 ft: 13606 corp: 24/2034b lim: 320 exec/s: 64 rss: 69Mb L: 92/137 MS: 1 PersAutoDict- DE: "\036\000\000\000"- 00:07:06.253 [2024-07-12 21:29:44.955205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:fa020303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:06.253 [2024-07-12 21:29:44.955233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.253 #65 NEW cov: 11753 ft: 13647 corp: 25/2126b lim: 320 exec/s: 65 rss: 70Mb L: 92/137 MS: 1 ShuffleBytes- 00:07:06.253 [2024-07-12 21:29:45.015521] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b8b8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb8b8b8b8b8b8b8b8 00:07:06.253 [2024-07-12 21:29:45.015549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.512 #66 NEW cov: 11753 ft: 13661 corp: 26/2198b lim: 320 exec/s: 66 rss: 70Mb L: 72/137 MS: 1 CrossOver- 00:07:06.512 [2024-07-12 21:29:45.065609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:06.512 [2024-07-12 21:29:45.065638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.512 #67 NEW cov: 11753 ft: 13723 corp: 27/2286b lim: 320 exec/s: 67 rss: 70Mb L: 88/137 MS: 1 ChangeBit- 00:07:06.512 [2024-07-12 21:29:45.126004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:fa020303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:06.512 [2024-07-12 21:29:45.126032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.512 [2024-07-12 21:29:45.126185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:5 nsid:3030303 cdw10:03030303 cdw11:03230303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:06.512 [2024-07-12 21:29:45.126204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.512 #68 NEW cov: 11753 ft: 13876 corp: 28/2467b lim: 320 exec/s: 68 rss: 70Mb L: 181/181 MS: 1 CrossOver- 00:07:06.512 [2024-07-12 21:29:45.186048] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b8b8b8b8 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb8b8b8b8b8b8b8b8 00:07:06.512 [2024-07-12 21:29:45.186077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.512 #69 NEW cov: 11753 ft: 13883 corp: 29/2533b lim: 320 exec/s: 69 rss: 70Mb L: 66/181 MS: 1 ChangeBit- 00:07:06.512 [2024-07-12 21:29:45.236284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0xa03030303030303 00:07:06.512 [2024-07-12 21:29:45.236314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.512 #70 NEW cov: 11753 ft: 13885 corp: 30/2621b lim: 320 exec/s: 70 rss: 70Mb L: 88/181 MS: 1 ChangeByte- 00:07:06.512 [2024-07-12 21:29:45.286557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:fd030303 cdw11:05fdfcfc SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:06.512 [2024-07-12 21:29:45.286585] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.512 [2024-07-12 21:29:45.286742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:5 nsid:3030303 cdw10:03030303 cdw11:03230303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:06.512 [2024-07-12 21:29:45.286759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.771 #76 NEW cov: 11753 ft: 13889 corp: 31/2802b lim: 320 exec/s: 76 rss: 70Mb L: 181/181 MS: 1 ChangeBinInt- 00:07:06.771 [2024-07-12 21:29:45.346524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:06.771 [2024-07-12 21:29:45.346550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.771 #77 NEW cov: 11753 ft: 13926 corp: 32/2890b lim: 320 exec/s: 77 rss: 70Mb L: 88/181 MS: 1 ChangeByte- 00:07:06.771 [2024-07-12 21:29:45.396727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:3032303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:06.771 [2024-07-12 21:29:45.396757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.771 #78 NEW cov: 11753 ft: 13928 corp: 33/2979b lim: 320 exec/s: 78 rss: 70Mb L: 89/181 MS: 1 InsertByte- 00:07:06.771 [2024-07-12 21:29:45.436830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (03) qid:0 cid:4 nsid:a030303 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x303030303030303 00:07:06.771 [2024-07-12 21:29:45.436856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.771 #79 NEW cov: 11753 ft: 13945 corp: 34/3067b lim: 320 exec/s: 39 rss: 70Mb L: 88/181 MS: 1 ChangeBinInt- 00:07:06.771 #79 DONE cov: 11753 ft: 13945 corp: 34/3067b lim: 320 exec/s: 39 rss: 70Mb 00:07:06.771 ###### Recommended dictionary. ###### 00:07:06.771 "\036\000\000\000" # Uses: 1 00:07:06.771 "\001\000" # Uses: 0 00:07:06.771 ###### End of recommended dictionary. 
###### 00:07:06.771 Done 79 runs in 2 second(s) 00:07:07.029 21:29:45 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:07.029 21:29:45 -- ../common.sh@72 -- # (( i++ )) 00:07:07.029 21:29:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:07.029 21:29:45 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:07.029 21:29:45 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:07.029 21:29:45 -- nvmf/run.sh@24 -- # local timen=1 00:07:07.029 21:29:45 -- nvmf/run.sh@25 -- # local core=0x1 00:07:07.029 21:29:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:07.029 21:29:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:07.029 21:29:45 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:07.029 21:29:45 -- nvmf/run.sh@29 -- # port=4401 00:07:07.029 21:29:45 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:07.029 21:29:45 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:07.029 21:29:45 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:07.029 21:29:45 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:07.029 [2024-07-12 21:29:45.629893] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:07.029 [2024-07-12 21:29:45.629958] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3577474 ] 00:07:07.029 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.287 [2024-07-12 21:29:45.883537] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.287 [2024-07-12 21:29:45.967306] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:07.287 [2024-07-12 21:29:45.967453] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.287 [2024-07-12 21:29:46.025305] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:07.287 [2024-07-12 21:29:46.041594] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:07.288 INFO: Running with entropic power schedule (0xFF, 100). 00:07:07.288 INFO: Seed: 86273965 00:07:07.546 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:07.546 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:07.546 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:07.546 INFO: A corpus is not provided, starting from an empty corpus 00:07:07.546 #2 INITED exec/s: 0 rss: 60Mb 00:07:07.546 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
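The banner above closes fuzz target 0, and the startup lines here begin target 1: llvm_nvme_fuzz pointed at an NVMe-oF TCP listener on port 4401, exercising the GET LOG PAGE admin path. The NEW_FUNC records just below name the harness entry points (TestOneInput and fuzz_admin_get_log_page_command in llvm_nvme_fuzz.c). As context for those records, here is a minimal sketch of the libFuzzer entry-point shape such a harness uses. Only LLVMFuzzerTestOneInput is the real convention; the handler body, its wiring, and the build line are illustrative assumptions, not SPDK's actual code.

    /*
     * Minimal libFuzzer target sketch (NOT SPDK's harness).
     * Hypothetical build: clang -g -fsanitize=fuzzer sketch.c -o sketch
     */
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Placeholder for a handler that maps raw fuzz bytes onto an admin
     * command's fields (opcode, nsid, cdw10..cdw15) and submits it. */
    static void fuzz_admin_get_log_page_command(const uint8_t *data, size_t size)
    {
        uint32_t cdw10 = 0;

        if (size >= sizeof(cdw10)) {
            memcpy(&cdw10, data, sizeof(cdw10)); /* first 4 fuzz bytes become cdw10 */
        }
        (void)cdw10; /* a real harness would build and submit the command here */
    }

    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        fuzz_admin_get_log_page_command(data, size);
        return 0; /* non-zero returns are reserved by libFuzzer */
    }

Each "#N NEW cov: ... MS: ..." line that follows records an input which reached new coverage, together with the mutation sequence (ShuffleBytes, InsertRepeatedBytes, ChangeBinInt, and so on) that produced it.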
00:07:07.546 This may also happen if the target rejected all inputs we tried so far 00:07:07.546 [2024-07-12 21:29:46.086622] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:07.546 [2024-07-12 21:29:46.086835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.546 [2024-07-12 21:29:46.086864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.805 NEW_FUNC[1/671]: 0x481610 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:07.805 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:07.805 #5 NEW cov: 11553 ft: 11554 corp: 2/10b lim: 30 exec/s: 0 rss: 67Mb L: 9/9 MS: 3 ChangeBinInt-ChangeByte-InsertRepeatedBytes- 00:07:07.805 [2024-07-12 21:29:46.387313] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:07.805 [2024-07-12 21:29:46.387535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.805 [2024-07-12 21:29:46.387571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.805 #6 NEW cov: 11666 ft: 12156 corp: 3/19b lim: 30 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ChangeBinInt- 00:07:07.805 [2024-07-12 21:29:46.437389] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000009b5 00:07:07.805 [2024-07-12 21:29:46.437599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.805 [2024-07-12 21:29:46.437625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.805 #7 NEW cov: 11672 ft: 12365 corp: 4/28b lim: 30 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ShuffleBytes- 00:07:07.805 [2024-07-12 21:29:46.477504] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:07.805 [2024-07-12 21:29:46.477699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.805 [2024-07-12 21:29:46.477729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.805 #8 NEW cov: 11757 ft: 12581 corp: 5/37b lim: 30 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ShuffleBytes- 00:07:07.805 [2024-07-12 21:29:46.517695] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:07.805 [2024-07-12 21:29:46.517809] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:07.805 [2024-07-12 21:29:46.517915] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:07.805 [2024-07-12 21:29:46.518018] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:07.805 [2024-07-12 21:29:46.518214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aad81ad cdw11:00000001 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:07.805 [2024-07-12 21:29:46.518240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.805 [2024-07-12 21:29:46.518293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.805 [2024-07-12 21:29:46.518308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.805 [2024-07-12 21:29:46.518359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.805 [2024-07-12 21:29:46.518372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.805 [2024-07-12 21:29:46.518429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.805 [2024-07-12 21:29:46.518447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.805 #9 NEW cov: 11757 ft: 13316 corp: 6/63b lim: 30 exec/s: 0 rss: 68Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:07.805 [2024-07-12 21:29:46.557753] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (309976) > buf size (4096) 00:07:07.806 [2024-07-12 21:29:46.557953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:07.806 [2024-07-12 21:29:46.557979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.806 #10 NEW cov: 11780 ft: 13441 corp: 7/73b lim: 30 exec/s: 0 rss: 68Mb L: 10/26 MS: 1 InsertByte- 00:07:08.065 [2024-07-12 21:29:46.597824] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:08.065 [2024-07-12 21:29:46.598020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.598045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.065 #16 NEW cov: 11780 ft: 13491 corp: 8/82b lim: 30 exec/s: 0 rss: 68Mb L: 9/26 MS: 1 ChangeByte- 00:07:08.065 [2024-07-12 21:29:46.637938] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb5b5 00:07:08.065 [2024-07-12 21:29:46.638151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:05000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.638176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.065 #17 NEW cov: 11780 ft: 13601 corp: 9/91b lim: 30 exec/s: 0 rss: 68Mb L: 9/26 MS: 1 CMP- DE: "\005\000\000\000"- 00:07:08.065 [2024-07-12 21:29:46.678130] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.065 [2024-07-12 21:29:46.678243] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.065 [2024-07-12 21:29:46.678349] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.065 [2024-07-12 21:29:46.678563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.678588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.065 [2024-07-12 21:29:46.678641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.678655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.065 [2024-07-12 21:29:46.678705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.678718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.065 #18 NEW cov: 11780 ft: 13846 corp: 10/113b lim: 30 exec/s: 0 rss: 68Mb L: 22/26 MS: 1 EraseBytes- 00:07:08.065 [2024-07-12 21:29:46.718275] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adff 00:07:08.065 [2024-07-12 21:29:46.718397] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.065 [2024-07-12 21:29:46.718507] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.065 [2024-07-12 21:29:46.718718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.718747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.065 [2024-07-12 21:29:46.718801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.718814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.065 [2024-07-12 21:29:46.718866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.718880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.065 #19 NEW cov: 11780 ft: 13885 corp: 11/136b lim: 30 exec/s: 0 rss: 69Mb L: 23/26 MS: 1 InsertByte- 00:07:08.065 [2024-07-12 21:29:46.758380] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adff 00:07:08.065 [2024-07-12 21:29:46.758499] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.065 [2024-07-12 21:29:46.758606] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.065 [2024-07-12 21:29:46.758818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.758842] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.065 [2024-07-12 21:29:46.758893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.758907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.065 [2024-07-12 21:29:46.758959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:515281ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.758973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.065 #20 NEW cov: 11780 ft: 13895 corp: 12/159b lim: 30 exec/s: 0 rss: 69Mb L: 23/26 MS: 1 ChangeBinInt- 00:07:08.065 [2024-07-12 21:29:46.798544] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adff 00:07:08.065 [2024-07-12 21:29:46.798669] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.065 [2024-07-12 21:29:46.798769] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.065 [2024-07-12 21:29:46.798866] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5cd 00:07:08.065 [2024-07-12 21:29:46.799075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.799100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.065 [2024-07-12 21:29:46.799154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.799167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.065 [2024-07-12 21:29:46.799221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:515281ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.065 [2024-07-12 21:29:46.799233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.065 [2024-07-12 21:29:46.799284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.066 [2024-07-12 21:29:46.799300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.066 #21 NEW cov: 11780 ft: 13944 corp: 13/186b lim: 30 exec/s: 0 rss: 69Mb L: 27/27 MS: 1 CrossOver- 00:07:08.066 [2024-07-12 21:29:46.838578] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:08.066 [2024-07-12 21:29:46.838775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.066 [2024-07-12 21:29:46.838801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
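The rejection pattern in these records is mechanical: nvmf_ctrlr_get_log_page flags either the offset or the total length that the fuzzed dwords imply. Per the NVMe base specification, GET LOG PAGE takes its log identifier from CDW10[07:00], its zero-based dword count from NUMDL (CDW10[31:16]) and NUMDU (CDW11[15:00]), and its byte offset from LPOL/LPOU (CDW12/CDW13). The sketch below redoes that arithmetic for one command printed earlier (cdw10:2eb581b5 cdw11:00000001) and reproduces the logged "Get log page: len (309976) > buf size (4096)"; the function and variable names are illustrative, not SPDK's implementation.

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Zero-based dword count: NUMDL in CDW10[31:16], NUMDU in CDW11[15:00]. */
    static uint64_t get_log_page_len_bytes(uint32_t cdw10, uint32_t cdw11)
    {
        uint64_t numdl = (cdw10 >> 16) & 0xffff;
        uint64_t numdu = cdw11 & 0xffff;

        return (((numdu << 16) | numdl) + 1) * 4; /* +1: zero-based; *4: dwords -> bytes */
    }

    int main(void)
    {
        /* cdw10:2eb581b5 cdw11:00000001, as printed earlier in this log */
        printf("lid=0x%02x len=%" PRIu64 "\n",
               0x2eb581b5u & 0xff,
               get_log_page_len_bytes(0x2eb581b5u, 0x00000001u));
        /* -> lid=0xb5 len=309976, matching the rejection in the log */
        return 0;
    }

The "Invalid log page offset" values are consistent with the same layout: 0x10000b5b5, for example, reads as LPOU=0x00000001 (CDW13) over LPOL=0x0000b5b5 (CDW12), i.e. the fuzzer's repeated 0xb5 fill landing in the offset dwords.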
00:07:08.325 #22 NEW cov: 11780 ft: 13973 corp: 14/195b lim: 30 exec/s: 0 rss: 69Mb L: 9/27 MS: 1 ShuffleBytes- 00:07:08.325 [2024-07-12 21:29:46.878747] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb5ad 00:07:08.325 [2024-07-12 21:29:46.878858] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.325 [2024-07-12 21:29:46.878962] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.325 [2024-07-12 21:29:46.879060] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adb5 00:07:08.325 [2024-07-12 21:29:46.879267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:05000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:46.879292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.325 [2024-07-12 21:29:46.879345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:46.879358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.325 [2024-07-12 21:29:46.879409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ad518152 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:46.879422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.325 [2024-07-12 21:29:46.879474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:46.879487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.325 #23 NEW cov: 11780 ft: 14010 corp: 15/222b lim: 30 exec/s: 0 rss: 69Mb L: 27/27 MS: 1 CrossOver- 00:07:08.325 [2024-07-12 21:29:46.918805] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5ff 00:07:08.325 [2024-07-12 21:29:46.918915] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:08.325 [2024-07-12 21:29:46.919117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:46.919142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.325 [2024-07-12 21:29:46.919198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff81b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:46.919212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.325 #24 NEW cov: 11780 ft: 14267 corp: 16/234b lim: 30 exec/s: 0 rss: 69Mb L: 12/27 MS: 1 InsertRepeatedBytes- 00:07:08.325 [2024-07-12 21:29:46.958973] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:08.325 [2024-07-12 21:29:46.959084] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 
0x30000ffff 00:07:08.325 [2024-07-12 21:29:46.959187] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:08.325 [2024-07-12 21:29:46.959400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:46.959426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.325 [2024-07-12 21:29:46.959479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:46.959493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.325 [2024-07-12 21:29:46.959544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:46.959558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.325 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:08.325 #25 NEW cov: 11803 ft: 14314 corp: 17/257b lim: 30 exec/s: 0 rss: 69Mb L: 23/27 MS: 1 InsertRepeatedBytes- 00:07:08.325 [2024-07-12 21:29:46.999038] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:08.325 [2024-07-12 21:29:46.999248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58108 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:46.999277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.325 #26 NEW cov: 11803 ft: 14364 corp: 18/266b lim: 30 exec/s: 0 rss: 69Mb L: 9/27 MS: 1 ChangeBit- 00:07:08.325 [2024-07-12 21:29:47.029151] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000009b5 00:07:08.325 [2024-07-12 21:29:47.029576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:47.029609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.325 [2024-07-12 21:29:47.029662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:47.029676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.325 [2024-07-12 21:29:47.029730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:47.029744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.325 #27 NEW cov: 11820 ft: 14446 corp: 19/287b lim: 30 exec/s: 0 rss: 69Mb L: 21/27 MS: 1 InsertRepeatedBytes- 00:07:08.325 [2024-07-12 21:29:47.069259] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: 
Invalid log page offset 0x1000009b5 00:07:08.325 [2024-07-12 21:29:47.069473] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (512) > len (4) 00:07:08.325 [2024-07-12 21:29:47.069687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:47.069713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.325 [2024-07-12 21:29:47.069768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:47.069782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.325 [2024-07-12 21:29:47.069835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.325 [2024-07-12 21:29:47.069853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.325 #28 NEW cov: 11826 ft: 14476 corp: 20/308b lim: 30 exec/s: 28 rss: 69Mb L: 21/27 MS: 1 ChangeBit- 00:07:08.585 [2024-07-12 21:29:47.109342] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100002cb5 00:07:08.585 [2024-07-12 21:29:47.109573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.585 [2024-07-12 21:29:47.109598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.585 #29 NEW cov: 11826 ft: 14515 corp: 21/317b lim: 30 exec/s: 29 rss: 70Mb L: 9/27 MS: 1 EraseBytes- 00:07:08.585 [2024-07-12 21:29:47.149348] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5ff 00:07:08.585 [2024-07-12 21:29:47.149466] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:08.585 [2024-07-12 21:29:47.149656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.585 [2024-07-12 21:29:47.149681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.585 [2024-07-12 21:29:47.149735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.585 [2024-07-12 21:29:47.149749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.585 #30 NEW cov: 11826 ft: 14552 corp: 22/333b lim: 30 exec/s: 30 rss: 70Mb L: 16/27 MS: 1 InsertRepeatedBytes- 00:07:08.585 [2024-07-12 21:29:47.189593] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:08.585 [2024-07-12 21:29:47.189705] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (209924) > buf size (4096) 00:07:08.585 [2024-07-12 21:29:47.189920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58108 cdw11:00000001 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:08.585 [2024-07-12 21:29:47.189945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.585 [2024-07-12 21:29:47.189997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cd000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.585 [2024-07-12 21:29:47.190011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.585 #31 NEW cov: 11826 ft: 14596 corp: 23/347b lim: 30 exec/s: 31 rss: 70Mb L: 14/27 MS: 1 InsertRepeatedBytes- 00:07:08.585 [2024-07-12 21:29:47.229711] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:08.585 [2024-07-12 21:29:47.229826] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:08.585 [2024-07-12 21:29:47.229924] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb5b5 00:07:08.585 [2024-07-12 21:29:47.230122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b583b5 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.585 [2024-07-12 21:29:47.230147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.585 [2024-07-12 21:29:47.230202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.585 [2024-07-12 21:29:47.230216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.585 [2024-07-12 21:29:47.230269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff0009 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.585 [2024-07-12 21:29:47.230285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.585 #32 NEW cov: 11826 ft: 14609 corp: 24/367b lim: 30 exec/s: 32 rss: 70Mb L: 20/27 MS: 1 InsertRepeatedBytes- 00:07:08.585 [2024-07-12 21:29:47.269789] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:08.585 [2024-07-12 21:29:47.269996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.585 [2024-07-12 21:29:47.270020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.585 #33 NEW cov: 11826 ft: 14629 corp: 25/376b lim: 30 exec/s: 33 rss: 70Mb L: 9/27 MS: 1 ChangeByte- 00:07:08.585 [2024-07-12 21:29:47.299923] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:08.585 [2024-07-12 21:29:47.300036] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:08.585 [2024-07-12 21:29:47.300140] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb5b5 00:07:08.585 [2024-07-12 21:29:47.300334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b583b5 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.585 [2024-07-12 21:29:47.300359] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.586 [2024-07-12 21:29:47.300410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.586 [2024-07-12 21:29:47.300424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.586 [2024-07-12 21:29:47.300472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:06ff0009 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.586 [2024-07-12 21:29:47.300485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.586 #34 NEW cov: 11826 ft: 14635 corp: 26/396b lim: 30 exec/s: 34 rss: 70Mb L: 20/27 MS: 1 ChangeBinInt- 00:07:08.586 [2024-07-12 21:29:47.340005] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100002cb5 00:07:08.586 [2024-07-12 21:29:47.340130] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb5b5 00:07:08.586 [2024-07-12 21:29:47.340323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.586 [2024-07-12 21:29:47.340349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.586 [2024-07-12 21:29:47.340404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:05000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.586 [2024-07-12 21:29:47.340417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.586 #35 NEW cov: 11826 ft: 14640 corp: 27/409b lim: 30 exec/s: 35 rss: 70Mb L: 13/27 MS: 1 PersAutoDict- DE: "\005\000\000\000"- 00:07:08.845 [2024-07-12 21:29:47.380142] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5ff 00:07:08.845 [2024-07-12 21:29:47.380254] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:08.845 [2024-07-12 21:29:47.380359] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:08.845 [2024-07-12 21:29:47.380468] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:08.845 [2024-07-12 21:29:47.380682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.845 [2024-07-12 21:29:47.380710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.845 [2024-07-12 21:29:47.380765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.845 [2024-07-12 21:29:47.380778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.845 [2024-07-12 21:29:47.380830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b583b5 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.845 
[2024-07-12 21:29:47.380843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.845 [2024-07-12 21:29:47.380898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.845 [2024-07-12 21:29:47.380911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.845 #36 NEW cov: 11826 ft: 14661 corp: 28/435b lim: 30 exec/s: 36 rss: 70Mb L: 26/27 MS: 1 InsertRepeatedBytes- 00:07:08.845 [2024-07-12 21:29:47.420295] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (572120) > buf size (4096) 00:07:08.845 [2024-07-12 21:29:47.420405] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (448216) > buf size (4096) 00:07:08.845 [2024-07-12 21:29:47.420515] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:08.845 [2024-07-12 21:29:47.420728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb50209 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.845 [2024-07-12 21:29:47.420753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.845 [2024-07-12 21:29:47.420807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.845 [2024-07-12 21:29:47.420822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.845 [2024-07-12 21:29:47.420874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00b583b5 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.845 [2024-07-12 21:29:47.420887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.845 #42 NEW cov: 11826 ft: 14676 corp: 29/457b lim: 30 exec/s: 42 rss: 70Mb L: 22/27 MS: 1 CrossOver- 00:07:08.845 [2024-07-12 21:29:47.460329] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:08.845 [2024-07-12 21:29:47.460559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.845 [2024-07-12 21:29:47.460584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.845 #43 NEW cov: 11826 ft: 14730 corp: 30/464b lim: 30 exec/s: 43 rss: 70Mb L: 7/27 MS: 1 EraseBytes- 00:07:08.845 [2024-07-12 21:29:47.500467] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100002cb5 00:07:08.845 [2024-07-12 21:29:47.500578] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb5b5 00:07:08.845 [2024-07-12 21:29:47.500682] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100002cb5 00:07:08.845 [2024-07-12 21:29:47.500875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.845 [2024-07-12 21:29:47.500900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.845 [2024-07-12 21:29:47.500955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:05000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.846 [2024-07-12 21:29:47.500971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.846 [2024-07-12 21:29:47.501024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.846 [2024-07-12 21:29:47.501037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.846 #44 NEW cov: 11826 ft: 14745 corp: 31/483b lim: 30 exec/s: 44 rss: 70Mb L: 19/27 MS: 1 CopyPart- 00:07:08.846 [2024-07-12 21:29:47.540631] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.846 [2024-07-12 21:29:47.540760] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.846 [2024-07-12 21:29:47.540866] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.846 [2024-07-12 21:29:47.540968] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.846 [2024-07-12 21:29:47.541182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.846 [2024-07-12 21:29:47.541207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.846 [2024-07-12 21:29:47.541260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.846 [2024-07-12 21:29:47.541274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.846 [2024-07-12 21:29:47.541326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.846 [2024-07-12 21:29:47.541339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.846 [2024-07-12 21:29:47.541400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.846 [2024-07-12 21:29:47.541413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.846 #45 NEW cov: 11826 ft: 14750 corp: 32/512b lim: 30 exec/s: 45 rss: 70Mb L: 29/29 MS: 1 CopyPart- 00:07:08.846 [2024-07-12 21:29:47.580761] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb5ad 00:07:08.846 [2024-07-12 21:29:47.580872] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:08.846 [2024-07-12 21:29:47.580976] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adb5 00:07:08.846 [2024-07-12 21:29:47.581175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:05000000 cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.846 [2024-07-12 21:29:47.581200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.846 [2024-07-12 21:29:47.581250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.846 [2024-07-12 21:29:47.581264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.846 [2024-07-12 21:29:47.581316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ad518152 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.846 [2024-07-12 21:29:47.581330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.846 #46 NEW cov: 11826 ft: 14765 corp: 33/533b lim: 30 exec/s: 46 rss: 70Mb L: 21/29 MS: 1 EraseBytes- 00:07:08.846 [2024-07-12 21:29:47.620778] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:08.846 [2024-07-12 21:29:47.620975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.846 [2024-07-12 21:29:47.621001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.105 #47 NEW cov: 11826 ft: 14769 corp: 34/542b lim: 30 exec/s: 47 rss: 70Mb L: 9/29 MS: 1 ChangeBinInt- 00:07:09.105 [2024-07-12 21:29:47.660894] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000009b5 00:07:09.105 [2024-07-12 21:29:47.661093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.105 [2024-07-12 21:29:47.661117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.105 #48 NEW cov: 11826 ft: 14804 corp: 35/550b lim: 30 exec/s: 48 rss: 70Mb L: 8/29 MS: 1 EraseBytes- 00:07:09.105 [2024-07-12 21:29:47.691003] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:09.105 [2024-07-12 21:29:47.691199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eae81b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.105 [2024-07-12 21:29:47.691228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.105 #49 NEW cov: 11826 ft: 14867 corp: 36/559b lim: 30 exec/s: 49 rss: 70Mb L: 9/29 MS: 1 ChangeBinInt- 00:07:09.105 [2024-07-12 21:29:47.721050] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5ff 00:07:09.105 [2024-07-12 21:29:47.721161] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:09.105 [2024-07-12 21:29:47.721378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.105 [2024-07-12 21:29:47.721403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.105 [2024-07-12 21:29:47.721459] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.105 [2024-07-12 21:29:47.721473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.105 #50 NEW cov: 11826 ft: 14904 corp: 37/575b lim: 30 exec/s: 50 rss: 70Mb L: 16/29 MS: 1 ChangeBinInt- 00:07:09.105 [2024-07-12 21:29:47.761248] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5ff 00:07:09.105 [2024-07-12 21:29:47.761360] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:09.105 [2024-07-12 21:29:47.761470] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:09.105 [2024-07-12 21:29:47.761572] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:09.105 [2024-07-12 21:29:47.761775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.105 [2024-07-12 21:29:47.761800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.105 [2024-07-12 21:29:47.761856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.105 [2024-07-12 21:29:47.761870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.105 [2024-07-12 21:29:47.761924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b583b5 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.105 [2024-07-12 21:29:47.761937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.105 [2024-07-12 21:29:47.761993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.105 [2024-07-12 21:29:47.762006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.105 #51 NEW cov: 11826 ft: 14914 corp: 38/601b lim: 30 exec/s: 51 rss: 70Mb L: 26/29 MS: 1 ShuffleBytes- 00:07:09.106 [2024-07-12 21:29:47.801372] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adff 00:07:09.106 [2024-07-12 21:29:47.801503] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (177848) > buf size (4096) 00:07:09.106 [2024-07-12 21:29:47.801607] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:09.106 [2024-07-12 21:29:47.801801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.106 [2024-07-12 21:29:47.801826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.106 [2024-07-12 21:29:47.801881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:adad0005 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.106 [2024-07-12 21:29:47.801894] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.106 [2024-07-12 21:29:47.801947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:515281ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.106 [2024-07-12 21:29:47.801960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.106 #52 NEW cov: 11826 ft: 14916 corp: 39/624b lim: 30 exec/s: 52 rss: 70Mb L: 23/29 MS: 1 PersAutoDict- DE: "\005\000\000\000"- 00:07:09.106 [2024-07-12 21:29:47.841433] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:09.106 [2024-07-12 21:29:47.841656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.106 [2024-07-12 21:29:47.841682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.106 #53 NEW cov: 11826 ft: 14925 corp: 40/631b lim: 30 exec/s: 53 rss: 70Mb L: 7/29 MS: 1 CrossOver- 00:07:09.106 [2024-07-12 21:29:47.881650] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adff 00:07:09.106 [2024-07-12 21:29:47.881761] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (177848) > buf size (4096) 00:07:09.106 [2024-07-12 21:29:47.881864] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x5152 00:07:09.106 [2024-07-12 21:29:47.881961] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000adad 00:07:09.106 [2024-07-12 21:29:47.882168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.106 [2024-07-12 21:29:47.882193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.106 [2024-07-12 21:29:47.882249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:adad0005 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.106 [2024-07-12 21:29:47.882263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.106 [2024-07-12 21:29:47.882316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.106 [2024-07-12 21:29:47.882329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.106 [2024-07-12 21:29:47.882384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:adad81ad cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.106 [2024-07-12 21:29:47.882398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.365 #54 NEW cov: 11826 ft: 14964 corp: 41/658b lim: 30 exec/s: 54 rss: 70Mb L: 27/29 MS: 1 InsertRepeatedBytes- 00:07:09.365 [2024-07-12 21:29:47.921734] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (572120) > buf size (4096) 00:07:09.365 [2024-07-12 21:29:47.921849] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: 
Get log page: len (448216) > buf size (4096) 00:07:09.365 [2024-07-12 21:29:47.921954] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff2e 00:07:09.365 [2024-07-12 21:29:47.922154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb50209 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.365 [2024-07-12 21:29:47.922181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.365 [2024-07-12 21:29:47.922234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.365 [2024-07-12 21:29:47.922247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.365 [2024-07-12 21:29:47.922301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00b583b5 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.365 [2024-07-12 21:29:47.922314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.365 #55 NEW cov: 11826 ft: 14989 corp: 42/681b lim: 30 exec/s: 55 rss: 70Mb L: 23/29 MS: 1 InsertByte- 00:07:09.365 [2024-07-12 21:29:47.961784] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000035b5 00:07:09.365 [2024-07-12 21:29:47.961984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb58109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.365 [2024-07-12 21:29:47.962014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.365 #56 NEW cov: 11826 ft: 15009 corp: 43/690b lim: 30 exec/s: 56 rss: 70Mb L: 9/29 MS: 1 ChangeBit- 00:07:09.365 [2024-07-12 21:29:47.991841] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b5b5 00:07:09.365 [2024-07-12 21:29:47.992062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2eb5819d cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.366 [2024-07-12 21:29:47.992087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.366 #57 NEW cov: 11826 ft: 15026 corp: 44/699b lim: 30 exec/s: 57 rss: 70Mb L: 9/29 MS: 1 ChangeByte- 00:07:09.366 [2024-07-12 21:29:48.022001] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100002cb5 00:07:09.366 [2024-07-12 21:29:48.022116] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb5b5 00:07:09.366 [2024-07-12 21:29:48.022221] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100002cb5 00:07:09.366 [2024-07-12 21:29:48.022423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.366 [2024-07-12 21:29:48.022454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.366 [2024-07-12 21:29:48.022511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:05000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:07:09.366 [2024-07-12 21:29:48.022525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.366 [2024-07-12 21:29:48.022581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.366 [2024-07-12 21:29:48.022594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.366 #58 NEW cov: 11826 ft: 15038 corp: 45/718b lim: 30 exec/s: 58 rss: 70Mb L: 19/29 MS: 1 ShuffleBytes- 00:07:09.366 [2024-07-12 21:29:48.062155] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100002cb5 00:07:09.366 [2024-07-12 21:29:48.062265] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb5b5 00:07:09.366 [2024-07-12 21:29:48.062368] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100002ce6 00:07:09.366 [2024-07-12 21:29:48.062479] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000e6e6 00:07:09.366 [2024-07-12 21:29:48.062676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.366 [2024-07-12 21:29:48.062702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.366 [2024-07-12 21:29:48.062754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:05000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.366 [2024-07-12 21:29:48.062768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.366 [2024-07-12 21:29:48.062821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b5b581b5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.366 [2024-07-12 21:29:48.062835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.366 [2024-07-12 21:29:48.062886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:e6e602e6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.366 [2024-07-12 21:29:48.062898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.366 #59 NEW cov: 11826 ft: 15051 corp: 46/747b lim: 30 exec/s: 29 rss: 70Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:07:09.366 #59 DONE cov: 11826 ft: 15051 corp: 46/747b lim: 30 exec/s: 29 rss: 70Mb 00:07:09.366 ###### Recommended dictionary. ###### 00:07:09.366 "\005\000\000\000" # Uses: 2 00:07:09.366 ###### End of recommended dictionary. 
###### 00:07:09.366 Done 59 runs in 2 second(s) 00:07:09.625 21:29:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:09.625 21:29:48 -- ../common.sh@72 -- # (( i++ )) 00:07:09.625 21:29:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:09.625 21:29:48 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:09.625 21:29:48 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:09.625 21:29:48 -- nvmf/run.sh@24 -- # local timen=1 00:07:09.625 21:29:48 -- nvmf/run.sh@25 -- # local core=0x1 00:07:09.625 21:29:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:09.625 21:29:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:09.625 21:29:48 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:09.625 21:29:48 -- nvmf/run.sh@29 -- # port=4402 00:07:09.625 21:29:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:09.625 21:29:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:09.625 21:29:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:09.625 21:29:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:09.625 [2024-07-12 21:29:48.250009] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:09.625 [2024-07-12 21:29:48.250097] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3577931 ] 00:07:09.625 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.885 [2024-07-12 21:29:48.430143] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.885 [2024-07-12 21:29:48.493789] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:09.885 [2024-07-12 21:29:48.493929] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.885 [2024-07-12 21:29:48.551854] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:09.885 [2024-07-12 21:29:48.568141] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:09.885 INFO: Running with entropic power schedule (0xFF, 100). 00:07:09.885 INFO: Seed: 2613285980 00:07:09.885 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:09.885 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:09.885 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:09.885 INFO: A corpus is not provided, starting from an empty corpus 00:07:09.885 #2 INITED exec/s: 0 rss: 60Mb 00:07:09.885 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:09.885 This may also happen if the target rejected all inputs we tried so far 00:07:09.885 [2024-07-12 21:29:48.612913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a9ff00a9 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.885 [2024-07-12 21:29:48.612947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.885 [2024-07-12 21:29:48.612997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.885 [2024-07-12 21:29:48.613012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.885 [2024-07-12 21:29:48.613040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.885 [2024-07-12 21:29:48.613056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.144 NEW_FUNC[1/670]: 0x484030 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:10.144 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:10.144 #7 NEW cov: 11502 ft: 11512 corp: 2/23b lim: 35 exec/s: 0 rss: 66Mb L: 22/22 MS: 5 ChangeByte-ChangeBit-ChangeByte-CopyPart-InsertRepeatedBytes- 00:07:10.416 [2024-07-12 21:29:48.933707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a9ff00a9 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:48.933743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.416 [2024-07-12 21:29:48.933792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:48.933808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.416 [2024-07-12 21:29:48.933837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:48.933852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.416 [2024-07-12 21:29:48.933881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:48.933899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.416 #13 NEW cov: 11624 ft: 12418 corp: 3/56b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:10.416 [2024-07-12 21:29:49.003759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:adad000a cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:49.003789] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.416 [2024-07-12 21:29:49.003836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:49.003852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.416 [2024-07-12 21:29:49.003881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:49.003897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.416 [2024-07-12 21:29:49.003924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:49.003940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.416 #14 NEW cov: 11630 ft: 12665 corp: 4/87b lim: 35 exec/s: 0 rss: 67Mb L: 31/33 MS: 1 InsertRepeatedBytes- 00:07:10.416 [2024-07-12 21:29:49.053705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:49.053735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.416 #15 NEW cov: 11715 ft: 13315 corp: 5/94b lim: 35 exec/s: 0 rss: 67Mb L: 7/33 MS: 1 CrossOver- 00:07:10.416 [2024-07-12 21:29:49.103862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a4a400a4 cdw11:a400a4a4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:49.103890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.416 #16 NEW cov: 11715 ft: 13374 corp: 6/101b lim: 35 exec/s: 0 rss: 67Mb L: 7/33 MS: 1 InsertRepeatedBytes- 00:07:10.416 [2024-07-12 21:29:49.154118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:1700ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:49.154147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.416 [2024-07-12 21:29:49.154195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:17170017 cdw11:17001717 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:49.154211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.416 [2024-07-12 21:29:49.154240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:17170017 cdw11:17001717 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.416 [2024-07-12 21:29:49.154255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.416 [2024-07-12 21:29:49.154283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:17170017 cdw11:17001717 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:07:10.416 [2024-07-12 21:29:49.154297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.741 #17 NEW cov: 11715 ft: 13536 corp: 7/134b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:10.741 [2024-07-12 21:29:49.224358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a9ff00a9 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.224388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.741 [2024-07-12 21:29:49.224420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.224436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.741 [2024-07-12 21:29:49.224473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.224488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.741 [2024-07-12 21:29:49.224516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.224547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.741 #18 NEW cov: 11715 ft: 13631 corp: 8/167b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 ShuffleBytes- 00:07:10.741 [2024-07-12 21:29:49.294429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.294464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.741 [2024-07-12 21:29:49.294511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.294527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.741 [2024-07-12 21:29:49.294555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.294570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.741 #19 NEW cov: 11715 ft: 13775 corp: 9/193b lim: 35 exec/s: 0 rss: 67Mb L: 26/33 MS: 1 CrossOver- 00:07:10.741 [2024-07-12 21:29:49.344493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ff0000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.344522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.741 #25 NEW cov: 11715 ft: 13884 corp: 10/200b lim: 35 exec/s: 0 rss: 67Mb L: 7/33 MS: 1 ChangeBinInt- 
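[Editor's note] The IDENTIFY (06) entries above are the raw admin commands the fuzzer submitted, printed as the opcode plus the fuzzed command dwords (cdw10/cdw11), each followed by the completion the target returned. One of them can be replayed by hand against a target configured like this one. The sketch below is illustrative only: it assumes a host with nvme-cli and kernel NVMe/TCP support, a target still listening on 127.0.0.1:4402 as in the trace above, and /dev/nvme1 is a placeholder for whatever controller device the connect creates locally.

    # Connect to the fuzzed subsystem over NVMe/TCP (parameters taken from
    # the trid in the run trace: trtype:tcp traddr:127.0.0.1 trsvcid:4402).
    nvme connect -t tcp -a 127.0.0.1 -s 4402 -n nqn.2016-06.io.spdk:cnode1

    # Replay the logged command "IDENTIFY (06) nsid:0 cdw10:a9ff00a9 cdw11:ff00ffff".
    # Identify transfers 4096 bytes controller-to-host, hence --read/--data-len.
    nvme admin-passthru /dev/nvme1 --opcode=0x06 --namespace-id=0 \
        --cdw10=0xa9ff00a9 --cdw11=0xff00ffff --data-len=4096 --read

    nvme disconnect -n nqn.2016-06.io.spdk:cnode1

A target behaving like the one in this run would be expected to fail the replayed command the same way the log shows, with INVALID FIELD (sct/sc 00/02), presumably because the low byte of cdw10 carries a CNS value the target does not implement.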
00:07:10.741 [2024-07-12 21:29:49.404772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.404801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.741 [2024-07-12 21:29:49.404848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.404865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.741 [2024-07-12 21:29:49.404896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.404912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.741 [2024-07-12 21:29:49.404945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:adad00ad cdw11:ad00ad0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.404961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.741 #26 NEW cov: 11715 ft: 13954 corp: 11/228b lim: 35 exec/s: 0 rss: 68Mb L: 28/33 MS: 1 CopyPart- 00:07:10.741 [2024-07-12 21:29:49.464969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a9ff00a9 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.465001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.741 [2024-07-12 21:29:49.465035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.465053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.741 [2024-07-12 21:29:49.465082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.741 [2024-07-12 21:29:49.465098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.741 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:10.741 #27 NEW cov: 11732 ft: 13976 corp: 12/250b lim: 35 exec/s: 0 rss: 68Mb L: 22/33 MS: 1 CrossOver- 00:07:11.000 [2024-07-12 21:29:49.525214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a9ff00a9 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.525247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.000 [2024-07-12 21:29:49.525281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:f400ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.525297] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.000 [2024-07-12 21:29:49.525326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.525342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.000 [2024-07-12 21:29:49.525370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.525386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.000 #28 NEW cov: 11732 ft: 14009 corp: 13/283b lim: 35 exec/s: 0 rss: 69Mb L: 33/33 MS: 1 ChangeByte- 00:07:11.000 [2024-07-12 21:29:49.575281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.575312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.000 [2024-07-12 21:29:49.575345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.575361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.000 [2024-07-12 21:29:49.575389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.575408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.000 [2024-07-12 21:29:49.575437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.575461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.000 #29 NEW cov: 11732 ft: 14061 corp: 14/313b lim: 35 exec/s: 29 rss: 69Mb L: 30/33 MS: 1 InsertRepeatedBytes- 00:07:11.000 [2024-07-12 21:29:49.625311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:adad00ad cdw11:ff00adff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.625341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.000 [2024-07-12 21:29:49.625389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ad0002ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.625406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.000 [2024-07-12 21:29:49.625434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.625456] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.000 #30 NEW cov: 11732 ft: 14085 corp: 15/339b lim: 35 exec/s: 30 rss: 69Mb L: 26/33 MS: 1 CMP- DE: "\377\377\377\377\377\377\002\377"- 00:07:11.000 [2024-07-12 21:29:49.675313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a43f00a4 cdw11:a400a4a4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.675342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.000 #31 NEW cov: 11732 ft: 14097 corp: 16/346b lim: 35 exec/s: 31 rss: 69Mb L: 7/33 MS: 1 ChangeByte- 00:07:11.000 [2024-07-12 21:29:49.735610] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:11.000 [2024-07-12 21:29:49.735749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.735771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.000 [2024-07-12 21:29:49.735804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.735820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.000 [2024-07-12 21:29:49.735849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:307500ad cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.735864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.000 [2024-07-12 21:29:49.735893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00ad0000 cdw11:ad00ad0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.000 [2024-07-12 21:29:49.735909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.260 #32 NEW cov: 11741 ft: 14118 corp: 17/374b lim: 35 exec/s: 32 rss: 69Mb L: 28/33 MS: 1 CMP- DE: "0u\000\000\000\000\000\000"- 00:07:11.260 [2024-07-12 21:29:49.795718] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:11.260 [2024-07-12 21:29:49.795884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.795927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.260 [2024-07-12 21:29:49.795960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:75000030 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.795976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.260 [2024-07-12 21:29:49.796004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:11.260 [2024-07-12 21:29:49.796021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.260 [2024-07-12 21:29:49.796050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.796065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.260 #33 NEW cov: 11741 ft: 14141 corp: 18/404b lim: 35 exec/s: 33 rss: 69Mb L: 30/33 MS: 1 PersAutoDict- DE: "0u\000\000\000\000\000\000"- 00:07:11.260 [2024-07-12 21:29:49.865885] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:11.260 [2024-07-12 21:29:49.866064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.866086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.260 [2024-07-12 21:29:49.866118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:75000030 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.866134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.260 [2024-07-12 21:29:49.866161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:4000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.866177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.260 [2024-07-12 21:29:49.866205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.866220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.260 #34 NEW cov: 11741 ft: 14187 corp: 19/434b lim: 35 exec/s: 34 rss: 69Mb L: 30/33 MS: 1 ChangeByte- 00:07:11.260 [2024-07-12 21:29:49.927103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:adad000a cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.927132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.260 [2024-07-12 21:29:49.927192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.927207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.260 [2024-07-12 21:29:49.927264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.927278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.260 [2024-07-12 
21:29:49.927338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.927356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.260 #35 NEW cov: 11741 ft: 14290 corp: 20/465b lim: 35 exec/s: 35 rss: 69Mb L: 31/33 MS: 1 ShuffleBytes- 00:07:11.260 [2024-07-12 21:29:49.967187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a9ff00a9 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.260 [2024-07-12 21:29:49.967212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.261 [2024-07-12 21:29:49.967284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:f400ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.261 [2024-07-12 21:29:49.967298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.261 [2024-07-12 21:29:49.967355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.261 [2024-07-12 21:29:49.967368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.261 [2024-07-12 21:29:49.967424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.261 [2024-07-12 21:29:49.967438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.261 #36 NEW cov: 11741 ft: 14338 corp: 21/498b lim: 35 exec/s: 36 rss: 69Mb L: 33/33 MS: 1 ShuffleBytes- 00:07:11.261 [2024-07-12 21:29:50.007180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a9ff00a9 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.261 [2024-07-12 21:29:50.007206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.261 [2024-07-12 21:29:50.007266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.261 [2024-07-12 21:29:50.007281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.261 [2024-07-12 21:29:50.007342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.261 [2024-07-12 21:29:50.007356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.261 #37 NEW cov: 11741 ft: 14380 corp: 22/525b lim: 35 exec/s: 37 rss: 69Mb L: 27/33 MS: 1 CrossOver- 00:07:11.520 [2024-07-12 21:29:50.047427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a9ff00a9 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.047457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.520 [2024-07-12 21:29:50.047518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:f400ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.047533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.520 [2024-07-12 21:29:50.047590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.047604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.520 [2024-07-12 21:29:50.047660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.047678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.520 #38 NEW cov: 11741 ft: 14387 corp: 23/558b lim: 35 exec/s: 38 rss: 69Mb L: 33/33 MS: 1 ShuffleBytes- 00:07:11.520 [2024-07-12 21:29:50.087132] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:11.520 [2024-07-12 21:29:50.087250] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:11.520 [2024-07-12 21:29:50.087599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.087627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.520 [2024-07-12 21:29:50.087691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:7500ff30 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.087709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.520 [2024-07-12 21:29:50.087769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.087785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.520 [2024-07-12 21:29:50.087844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff4000ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.087859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.520 #39 NEW cov: 11741 ft: 14448 corp: 24/592b lim: 35 exec/s: 39 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:11.520 [2024-07-12 21:29:50.127665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.127690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.520 [2024-07-12 
21:29:50.127763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.127778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.520 [2024-07-12 21:29:50.127832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.127846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.520 [2024-07-12 21:29:50.127900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.520 [2024-07-12 21:29:50.127914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.520 #40 NEW cov: 11741 ft: 14478 corp: 25/626b lim: 35 exec/s: 40 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:11.521 [2024-07-12 21:29:50.167818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.167843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.521 [2024-07-12 21:29:50.167915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.167931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.521 [2024-07-12 21:29:50.167987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00feff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.168000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.521 [2024-07-12 21:29:50.168058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.168071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.521 #41 NEW cov: 11741 ft: 14507 corp: 26/657b lim: 35 exec/s: 41 rss: 69Mb L: 31/34 MS: 1 InsertByte- 00:07:11.521 [2024-07-12 21:29:50.207457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ff0000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.207482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.521 #42 NEW cov: 11741 ft: 14565 corp: 27/664b lim: 35 exec/s: 42 rss: 69Mb L: 7/34 MS: 1 CopyPart- 00:07:11.521 [2024-07-12 21:29:50.248016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:1700ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.248041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.521 [2024-07-12 21:29:50.248099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:17170017 cdw11:17001717 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.248112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.521 [2024-07-12 21:29:50.248169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:17170017 cdw11:17001717 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.248182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.521 [2024-07-12 21:29:50.248241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:17170017 cdw11:17001717 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.248254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.521 #43 NEW cov: 11741 ft: 14640 corp: 28/697b lim: 35 exec/s: 43 rss: 69Mb L: 33/34 MS: 1 ChangeBinInt- 00:07:11.521 [2024-07-12 21:29:50.288132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:1700ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.288157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.521 [2024-07-12 21:29:50.288214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:17170017 cdw11:17001717 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.288228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.521 [2024-07-12 21:29:50.288285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:17170017 cdw11:17001717 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.288298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.521 [2024-07-12 21:29:50.288351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:21170017 cdw11:17001717 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.521 [2024-07-12 21:29:50.288365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.780 #44 NEW cov: 11741 ft: 14679 corp: 29/730b lim: 35 exec/s: 44 rss: 69Mb L: 33/34 MS: 1 ChangeBinInt- 00:07:11.780 [2024-07-12 21:29:50.327867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fff700ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.780 [2024-07-12 21:29:50.327892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.780 #45 NEW cov: 11741 ft: 14748 corp: 30/737b lim: 35 exec/s: 45 rss: 69Mb L: 7/34 MS: 1 ChangeBit- 00:07:11.780 [2024-07-12 21:29:50.368416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000021 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
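[Editor's note] With several fuzzers' worth of output interleaved in one console log, it is quicker to tally the target's responses than to read every completion line. Two one-liners that work on the strings printed above, assuming the console output has been saved to a file (console.log is a placeholder name):

    # Count how often each negative-path completion status appears.
    grep -oE 'INVALID (FIELD \(00/02\)|NAMESPACE OR FORMAT \(00/0b\))' console.log | sort | uniq -c

    # Rank the libFuzzer mutations that produced new coverage
    # (the "MS:" field of the "#N NEW cov" status lines).
    grep -oE 'MS: [0-9]+ [A-Za-z-]+' console.log | sort | uniq -c | sort -rn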
00:07:11.780 [2024-07-12 21:29:50.368445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.780 [2024-07-12 21:29:50.368503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.780 [2024-07-12 21:29:50.368517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.780 [2024-07-12 21:29:50.368575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.780 [2024-07-12 21:29:50.368589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.780 [2024-07-12 21:29:50.368646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.780 [2024-07-12 21:29:50.368659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.781 #46 NEW cov: 11741 ft: 14848 corp: 31/771b lim: 35 exec/s: 46 rss: 69Mb L: 34/34 MS: 1 ChangeByte- 00:07:11.781 [2024-07-12 21:29:50.408497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a9ff00a9 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.408522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.781 [2024-07-12 21:29:50.408581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:51005151 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.408595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.781 [2024-07-12 21:29:50.408655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.408669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.781 [2024-07-12 21:29:50.408726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.408739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.781 #47 NEW cov: 11741 ft: 14861 corp: 32/801b lim: 35 exec/s: 47 rss: 70Mb L: 30/34 MS: 1 InsertRepeatedBytes- 00:07:11.781 [2024-07-12 21:29:50.448623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000021 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.448647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.781 [2024-07-12 21:29:50.448705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 
[2024-07-12 21:29:50.448724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.781 [2024-07-12 21:29:50.448781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.448795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.781 [2024-07-12 21:29:50.448851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.448865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.781 #48 NEW cov: 11741 ft: 14863 corp: 33/835b lim: 35 exec/s: 48 rss: 70Mb L: 34/34 MS: 1 ChangeByte- 00:07:11.781 [2024-07-12 21:29:50.488747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.488772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.781 [2024-07-12 21:29:50.488831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.488845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.781 [2024-07-12 21:29:50.488901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:307500ad cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.488915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.781 [2024-07-12 21:29:50.488971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:0000ff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.488985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.781 #49 NEW cov: 11748 ft: 14897 corp: 34/869b lim: 35 exec/s: 49 rss: 70Mb L: 34/34 MS: 1 CrossOver- 00:07:11.781 [2024-07-12 21:29:50.528745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:adad00ad cdw11:ff00adff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.528770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.781 [2024-07-12 21:29:50.528845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ad00027f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.528859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.781 [2024-07-12 21:29:50.528918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:adad00ad cdw11:ad00adad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:11.781 [2024-07-12 21:29:50.528932] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.781 #50 NEW cov: 11748 ft: 14901 corp: 35/895b lim: 35 exec/s: 50 rss: 70Mb L: 26/34 MS: 1 ChangeBit- 00:07:12.041 [2024-07-12 21:29:50.568846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a9ff00a9 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.041 [2024-07-12 21:29:50.568871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.041 [2024-07-12 21:29:50.568929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.041 [2024-07-12 21:29:50.568943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.041 [2024-07-12 21:29:50.569002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.041 [2024-07-12 21:29:50.569016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.041 #51 NEW cov: 11748 ft: 14923 corp: 36/921b lim: 35 exec/s: 51 rss: 70Mb L: 26/34 MS: 1 CrossOver- 00:07:12.041 [2024-07-12 21:29:50.608676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff002d cdw11:ff00f7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.041 [2024-07-12 21:29:50.608702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.041 #52 NEW cov: 11748 ft: 14960 corp: 37/929b lim: 35 exec/s: 26 rss: 70Mb L: 8/34 MS: 1 InsertByte- 00:07:12.041 #52 DONE cov: 11748 ft: 14960 corp: 37/929b lim: 35 exec/s: 26 rss: 70Mb 00:07:12.041 ###### Recommended dictionary. ###### 00:07:12.041 "\377\377\377\377\377\377\002\377" # Uses: 0 00:07:12.041 "0u\000\000\000\000\000\000" # Uses: 1 00:07:12.041 ###### End of recommended dictionary. 
######
00:07:12.041 Done 52 runs in 2 second(s)
00:07:12.041 21:29:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf
00:07:12.041 21:29:50 -- ../common.sh@72 -- # (( i++ ))
00:07:12.041 21:29:50 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:12.041 21:29:50 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1
00:07:12.041 21:29:50 -- nvmf/run.sh@23 -- # local fuzzer_type=3
00:07:12.041 21:29:50 -- nvmf/run.sh@24 -- # local timen=1
00:07:12.041 21:29:50 -- nvmf/run.sh@25 -- # local core=0x1
00:07:12.041 21:29:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
00:07:12.041 21:29:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf
00:07:12.041 21:29:50 -- nvmf/run.sh@29 -- # printf %02d 3
00:07:12.041 21:29:50 -- nvmf/run.sh@29 -- # port=4403
00:07:12.041 21:29:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
00:07:12.041 21:29:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403'
00:07:12.041 21:29:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:12.041 21:29:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock
00:07:12.041 [2024-07-12 21:29:50.784935] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:07:12.041 [2024-07-12 21:29:50.784993] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3578317 ]
00:07:12.041 EAL: No free 2048 kB hugepages reported on node 1
00:07:12.301 [2024-07-12 21:29:50.958225] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:12.301 [2024-07-12 21:29:51.022912] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:12.301 [2024-07-12 21:29:51.023050] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:12.301 [2024-07-12 21:29:51.081093] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:12.560 [2024-07-12 21:29:51.097386] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 ***
00:07:12.560 INFO: Running with entropic power schedule (0xFF, 100).
00:07:12.560 INFO: Seed: 847331936
00:07:12.560 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:07:12.560 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:07:12.560 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
00:07:12.560 INFO: A corpus is not provided, starting from an empty corpus
00:07:12.560 #2 INITED exec/s: 0 rss: 60Mb
00:07:12.560 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:12.560 This may also happen if the target rejected all inputs we tried so far 00:07:12.819 NEW_FUNC[1/659]: 0x485d00 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:12.819 NEW_FUNC[2/659]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:12.819 #19 NEW cov: 11409 ft: 11410 corp: 2/5b lim: 20 exec/s: 0 rss: 67Mb L: 4/4 MS: 2 InsertByte-CopyPart- 00:07:12.819 [2024-07-12 21:29:51.463752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.819 [2024-07-12 21:29:51.463790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.819 NEW_FUNC[1/20]: 0x113e4b0 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:12.819 NEW_FUNC[2/20]: 0x113f030 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:07:12.819 #23 NEW cov: 11861 ft: 12704 corp: 3/19b lim: 20 exec/s: 0 rss: 67Mb L: 14/14 MS: 4 ChangeBinInt-CopyPart-ChangeByte-InsertRepeatedBytes- 00:07:12.819 [2024-07-12 21:29:51.513991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.819 [2024-07-12 21:29:51.514019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.819 #24 NEW cov: 11887 ft: 13139 corp: 4/35b lim: 20 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 CopyPart- 00:07:12.819 [2024-07-12 21:29:51.554078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.819 [2024-07-12 21:29:51.554103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.819 #27 NEW cov: 11972 ft: 13398 corp: 5/51b lim: 20 exec/s: 0 rss: 67Mb L: 16/16 MS: 3 ChangeBinInt-CopyPart-CrossOver- 00:07:12.820 [2024-07-12 21:29:51.593744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:12.820 [2024-07-12 21:29:51.593769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.079 #30 NEW cov: 11972 ft: 13500 corp: 6/58b lim: 20 exec/s: 0 rss: 67Mb L: 7/16 MS: 3 ShuffleBytes-ChangeBit-CrossOver- 00:07:13.079 #31 NEW cov: 11972 ft: 13613 corp: 7/63b lim: 20 exec/s: 0 rss: 67Mb L: 5/16 MS: 1 CrossOver- 00:07:13.079 [2024-07-12 21:29:51.664361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.079 [2024-07-12 21:29:51.664389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.079 #32 NEW cov: 11972 ft: 13766 corp: 8/81b lim: 20 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 CopyPart- 00:07:13.079 [2024-07-12 21:29:51.714515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.079 [2024-07-12 21:29:51.714541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY 
REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.079 #33 NEW cov: 11972 ft: 13792 corp: 9/98b lim: 20 exec/s: 0 rss: 68Mb L: 17/18 MS: 1 InsertByte- 00:07:13.079 #34 NEW cov: 11972 ft: 13833 corp: 10/115b lim: 20 exec/s: 0 rss: 68Mb L: 17/18 MS: 1 InsertRepeatedBytes- 00:07:13.079 [2024-07-12 21:29:51.794738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.079 [2024-07-12 21:29:51.794765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.079 #35 NEW cov: 11972 ft: 14006 corp: 11/132b lim: 20 exec/s: 0 rss: 68Mb L: 17/18 MS: 1 InsertByte- 00:07:13.338 #36 NEW cov: 11972 ft: 14024 corp: 12/136b lim: 20 exec/s: 0 rss: 68Mb L: 4/18 MS: 1 CopyPart- 00:07:13.338 [2024-07-12 21:29:51.884888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.338 [2024-07-12 21:29:51.884914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.338 #37 NEW cov: 11972 ft: 14138 corp: 13/150b lim: 20 exec/s: 0 rss: 68Mb L: 14/18 MS: 1 EraseBytes- 00:07:13.338 #38 NEW cov: 11972 ft: 14214 corp: 14/167b lim: 20 exec/s: 0 rss: 68Mb L: 17/18 MS: 1 ChangeByte- 00:07:13.338 [2024-07-12 21:29:51.965218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.338 [2024-07-12 21:29:51.965243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.338 #39 NEW cov: 11972 ft: 14300 corp: 15/184b lim: 20 exec/s: 0 rss: 68Mb L: 17/18 MS: 1 ChangeBinInt- 00:07:13.338 [2024-07-12 21:29:52.005318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.338 [2024-07-12 21:29:52.005343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.338 #40 NEW cov: 11972 ft: 14370 corp: 16/201b lim: 20 exec/s: 0 rss: 68Mb L: 17/18 MS: 1 ShuffleBytes- 00:07:13.338 [2024-07-12 21:29:52.045449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.339 [2024-07-12 21:29:52.045475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.339 #41 NEW cov: 11972 ft: 14384 corp: 17/218b lim: 20 exec/s: 0 rss: 68Mb L: 17/18 MS: 1 InsertRepeatedBytes- 00:07:13.339 [2024-07-12 21:29:52.085621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.339 [2024-07-12 21:29:52.085647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.339 #42 NEW cov: 11972 ft: 14428 corp: 18/235b lim: 20 exec/s: 0 rss: 68Mb L: 17/18 MS: 1 ChangeByte- 00:07:13.598 [2024-07-12 21:29:52.125763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.598 [2024-07-12 21:29:52.125790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY 
REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.598 #43 NEW cov: 11972 ft: 14447 corp: 19/253b lim: 20 exec/s: 43 rss: 68Mb L: 18/18 MS: 1 ShuffleBytes- 00:07:13.598 [2024-07-12 21:29:52.176016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.598 [2024-07-12 21:29:52.176040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.598 #44 NEW cov: 11972 ft: 14564 corp: 20/273b lim: 20 exec/s: 44 rss: 68Mb L: 20/20 MS: 1 CopyPart- 00:07:13.598 [2024-07-12 21:29:52.216033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.598 [2024-07-12 21:29:52.216059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.598 #45 NEW cov: 11972 ft: 14573 corp: 21/289b lim: 20 exec/s: 45 rss: 68Mb L: 16/20 MS: 1 ShuffleBytes- 00:07:13.598 [2024-07-12 21:29:52.255711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.598 [2024-07-12 21:29:52.255735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.598 #46 NEW cov: 11972 ft: 14672 corp: 22/296b lim: 20 exec/s: 46 rss: 68Mb L: 7/20 MS: 1 CopyPart- 00:07:13.598 [2024-07-12 21:29:52.296183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.598 [2024-07-12 21:29:52.296212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.598 #47 NEW cov: 11972 ft: 14692 corp: 23/313b lim: 20 exec/s: 47 rss: 69Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:13.598 [2024-07-12 21:29:52.336253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.598 [2024-07-12 21:29:52.336278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.598 #48 NEW cov: 11972 ft: 14734 corp: 24/330b lim: 20 exec/s: 48 rss: 69Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:13.857 #49 NEW cov: 11972 ft: 14756 corp: 25/347b lim: 20 exec/s: 49 rss: 69Mb L: 17/20 MS: 1 ChangeBit- 00:07:13.857 [2024-07-12 21:29:52.416465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.857 [2024-07-12 21:29:52.416491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.857 #50 NEW cov: 11972 ft: 14772 corp: 26/364b lim: 20 exec/s: 50 rss: 69Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:13.857 #51 NEW cov: 11972 ft: 14778 corp: 27/381b lim: 20 exec/s: 51 rss: 69Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:13.857 [2024-07-12 21:29:52.496746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.857 [2024-07-12 21:29:52.496772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.857 #52 NEW cov: 11972 ft: 14784 corp: 28/398b 
lim: 20 exec/s: 52 rss: 69Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:13.857 [2024-07-12 21:29:52.547008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.857 [2024-07-12 21:29:52.547035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.857 #53 NEW cov: 11972 ft: 14831 corp: 29/414b lim: 20 exec/s: 53 rss: 69Mb L: 16/20 MS: 1 ChangeBit- 00:07:13.857 [2024-07-12 21:29:52.586994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.857 [2024-07-12 21:29:52.587020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.857 #54 NEW cov: 11972 ft: 14835 corp: 30/431b lim: 20 exec/s: 54 rss: 69Mb L: 17/20 MS: 1 InsertByte- 00:07:13.857 [2024-07-12 21:29:52.627391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:13.857 [2024-07-12 21:29:52.627418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.117 #55 NEW cov: 11972 ft: 14900 corp: 31/451b lim: 20 exec/s: 55 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:14.117 [2024-07-12 21:29:52.677428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:14.117 [2024-07-12 21:29:52.677460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.117 #56 NEW cov: 11972 ft: 14917 corp: 32/471b lim: 20 exec/s: 56 rss: 69Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:14.117 #57 NEW cov: 11972 ft: 14962 corp: 33/484b lim: 20 exec/s: 57 rss: 69Mb L: 13/20 MS: 1 InsertRepeatedBytes- 00:07:14.117 [2024-07-12 21:29:52.757136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:14.117 [2024-07-12 21:29:52.757163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.117 #58 NEW cov: 11972 ft: 14987 corp: 34/491b lim: 20 exec/s: 58 rss: 69Mb L: 7/20 MS: 1 CopyPart- 00:07:14.117 #59 NEW cov: 11972 ft: 15062 corp: 35/508b lim: 20 exec/s: 59 rss: 69Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:14.117 [2024-07-12 21:29:52.837518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:14.117 [2024-07-12 21:29:52.837544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.117 #60 NEW cov: 11972 ft: 15091 corp: 36/522b lim: 20 exec/s: 60 rss: 69Mb L: 14/20 MS: 1 ChangeBinInt- 00:07:14.117 #61 NEW cov: 11972 ft: 15099 corp: 37/538b lim: 20 exec/s: 61 rss: 69Mb L: 16/20 MS: 1 InsertRepeatedBytes- 00:07:14.376 [2024-07-12 21:29:52.917768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:14.376 [2024-07-12 21:29:52.917795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.376 [2024-07-12 21:29:52.917919] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:07:14.376 [2024-07-12 21:29:52.917936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:1 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.376 #62 NEW cov: 11974 ft: 15501 corp: 38/546b lim: 20 exec/s: 62 rss: 69Mb L: 8/20 MS: 1 CrossOver- 00:07:14.376 [2024-07-12 21:29:52.968106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:14.376 [2024-07-12 21:29:52.968132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.376 #63 NEW cov: 11974 ft: 15512 corp: 39/564b lim: 20 exec/s: 63 rss: 69Mb L: 18/20 MS: 1 CrossOver- 00:07:14.376 [2024-07-12 21:29:53.008404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:14.376 [2024-07-12 21:29:53.008431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.376 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:14.376 #64 NEW cov: 11997 ft: 15527 corp: 40/584b lim: 20 exec/s: 64 rss: 70Mb L: 20/20 MS: 1 CrossOver- 00:07:14.376 [2024-07-12 21:29:53.058687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:07:14.376 [2024-07-12 21:29:53.058712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:3 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.376 [2024-07-12 21:29:53.058852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:14.376 [2024-07-12 21:29:53.058869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:14.376 #65 NEW cov: 11998 ft: 15933 corp: 41/604b lim: 20 exec/s: 65 rss: 70Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:14.376 #66 NEW cov: 11998 ft: 15940 corp: 42/608b lim: 20 exec/s: 33 rss: 70Mb L: 4/20 MS: 1 ChangeByte- 00:07:14.376 #66 DONE cov: 11998 ft: 15940 corp: 42/608b lim: 20 exec/s: 33 rss: 70Mb 00:07:14.376 Done 66 runs in 2 second(s) 00:07:14.635 21:29:53 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:14.635 21:29:53 -- ../common.sh@72 -- # (( i++ )) 00:07:14.635 21:29:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:14.635 21:29:53 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:14.635 21:29:53 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:14.635 21:29:53 -- nvmf/run.sh@24 -- # local timen=1 00:07:14.635 21:29:53 -- nvmf/run.sh@25 -- # local core=0x1 00:07:14.635 21:29:53 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:14.635 21:29:53 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:14.635 21:29:53 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:14.635 21:29:53 -- nvmf/run.sh@29 -- # port=4404 00:07:14.635 21:29:53 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:14.635 21:29:53 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 
00:07:14.635 21:29:53 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:14.636 21:29:53 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock
00:07:14.636 [2024-07-12 21:29:53.280657] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:07:14.636 [2024-07-12 21:29:53.280710] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3578854 ]
00:07:14.636 EAL: No free 2048 kB hugepages reported on node 1
00:07:14.895 [2024-07-12 21:29:53.460042] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:14.895 [2024-07-12 21:29:53.523224] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:14.895 [2024-07-12 21:29:53.523351] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:14.895 [2024-07-12 21:29:53.581193] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:14.895 [2024-07-12 21:29:53.597477] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 ***
00:07:14.895 INFO: Running with entropic power schedule (0xFF, 100).
00:07:14.895 INFO: Seed: 3348306195
00:07:14.895 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:07:14.895 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:07:14.895 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4
00:07:14.895 INFO: A corpus is not provided, starting from an empty corpus
00:07:14.895 #2 INITED exec/s: 0 rss: 60Mb
00:07:14.895 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:14.895 This may also happen if the target rejected all inputs we tried so far 00:07:14.895 [2024-07-12 21:29:53.642365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.895 [2024-07-12 21:29:53.642397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.895 [2024-07-12 21:29:53.642453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.895 [2024-07-12 21:29:53.642469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.895 [2024-07-12 21:29:53.642498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.895 [2024-07-12 21:29:53.642514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.895 [2024-07-12 21:29:53.642542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.895 [2024-07-12 21:29:53.642557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.895 [2024-07-12 21:29:53.642584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.895 [2024-07-12 21:29:53.642599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.413 NEW_FUNC[1/671]: 0x486df0 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:15.413 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:15.413 #3 NEW cov: 11532 ft: 11532 corp: 2/36b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:15.413 [2024-07-12 21:29:53.972914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.413 [2024-07-12 21:29:53.972951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.413 #4 NEW cov: 11645 ft: 12991 corp: 3/45b lim: 35 exec/s: 0 rss: 67Mb L: 9/35 MS: 1 CrossOver- 00:07:15.413 [2024-07-12 21:29:54.032953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.413 [2024-07-12 21:29:54.032982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.413 #5 NEW cov: 11651 ft: 13271 corp: 4/54b lim: 35 exec/s: 0 rss: 67Mb L: 9/35 MS: 1 ChangeBinInt- 00:07:15.413 [2024-07-12 21:29:54.103399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:15.413 [2024-07-12 21:29:54.103429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.413 [2024-07-12 21:29:54.103484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.413 [2024-07-12 21:29:54.103502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.413 [2024-07-12 21:29:54.103533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.413 [2024-07-12 21:29:54.103548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.413 [2024-07-12 21:29:54.103577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.413 [2024-07-12 21:29:54.103592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.413 [2024-07-12 21:29:54.103622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.413 [2024-07-12 21:29:54.103637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.413 #11 NEW cov: 11736 ft: 13531 corp: 5/89b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:15.413 [2024-07-12 21:29:54.173380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.413 [2024-07-12 21:29:54.173411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.413 [2024-07-12 21:29:54.173451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.413 [2024-07-12 21:29:54.173467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.672 #12 NEW cov: 11736 ft: 13817 corp: 6/105b lim: 35 exec/s: 0 rss: 67Mb L: 16/35 MS: 1 CrossOver- 00:07:15.672 [2024-07-12 21:29:54.233580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:584375db cdw11:45fd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.233620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.672 [2024-07-12 21:29:54.233671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.233688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.672 #13 NEW cov: 11736 ft: 13870 corp: 7/122b lim: 35 exec/s: 0 rss: 67Mb L: 17/35 MS: 1 CMP- DE: "u\333XCE\375(\000"- 00:07:15.672 [2024-07-12 21:29:54.293712] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:584375db cdw11:45fd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.293741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.672 [2024-07-12 21:29:54.293788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.293804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.672 #14 NEW cov: 11736 ft: 14002 corp: 8/139b lim: 35 exec/s: 0 rss: 68Mb L: 17/35 MS: 1 ShuffleBytes- 00:07:15.672 [2024-07-12 21:29:54.364077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.364106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.672 [2024-07-12 21:29:54.364154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.364169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.672 [2024-07-12 21:29:54.364197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.364212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.672 [2024-07-12 21:29:54.364240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.364254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.672 [2024-07-12 21:29:54.364282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.364296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.672 #15 NEW cov: 11736 ft: 14052 corp: 9/174b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:15.672 [2024-07-12 21:29:54.424103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:584375db cdw11:45fd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.424132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.672 [2024-07-12 21:29:54.424179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:db580075 cdw11:43450003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.424195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.672 [2024-07-12 21:29:54.424224] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.672 [2024-07-12 21:29:54.424240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.931 #16 NEW cov: 11736 ft: 14292 corp: 10/199b lim: 35 exec/s: 0 rss: 68Mb L: 25/35 MS: 1 PersAutoDict- DE: "u\333XCE\375(\000"- 00:07:15.931 [2024-07-12 21:29:54.474143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:584375db cdw11:45fd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-07-12 21:29:54.474171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.931 [2024-07-12 21:29:54.474218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:54000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-07-12 21:29:54.474234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.931 #17 NEW cov: 11736 ft: 14370 corp: 11/216b lim: 35 exec/s: 0 rss: 68Mb L: 17/35 MS: 1 ChangeByte- 00:07:15.931 [2024-07-12 21:29:54.524298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00080000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-07-12 21:29:54.524328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.931 [2024-07-12 21:29:54.524361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-07-12 21:29:54.524376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.931 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:15.931 #23 NEW cov: 11753 ft: 14432 corp: 12/232b lim: 35 exec/s: 0 rss: 68Mb L: 16/35 MS: 1 ChangeBinInt- 00:07:15.931 [2024-07-12 21:29:54.594534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:60000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-07-12 21:29:54.594562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.931 [2024-07-12 21:29:54.594609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:000a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-07-12 21:29:54.594624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.931 #24 NEW cov: 11753 ft: 14452 corp: 13/249b lim: 35 exec/s: 24 rss: 68Mb L: 17/35 MS: 1 InsertByte- 00:07:15.931 [2024-07-12 21:29:54.654728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:584375db cdw11:45fd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-07-12 21:29:54.654756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.931 [2024-07-12 21:29:54.654804] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-07-12 21:29:54.654819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.931 [2024-07-12 21:29:54.654848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-07-12 21:29:54.654863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.931 #25 NEW cov: 11753 ft: 14458 corp: 14/270b lim: 35 exec/s: 25 rss: 68Mb L: 21/35 MS: 1 CrossOver- 00:07:16.191 [2024-07-12 21:29:54.714939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:584375db cdw11:45fd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.714970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.191 [2024-07-12 21:29:54.715007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.715024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.191 [2024-07-12 21:29:54.715053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.715069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.191 #26 NEW cov: 11753 ft: 14473 corp: 15/291b lim: 35 exec/s: 26 rss: 68Mb L: 21/35 MS: 1 ChangeBit- 00:07:16.191 [2024-07-12 21:29:54.775141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:584375db cdw11:45fd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.775172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.191 [2024-07-12 21:29:54.775204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:db580075 cdw11:43450003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.775220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.191 [2024-07-12 21:29:54.775248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00002800 cdw11:00820001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.775263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.191 [2024-07-12 21:29:54.775291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:82008282 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.775322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.191 #27 NEW cov: 11753 ft: 14518 
corp: 16/321b lim: 35 exec/s: 27 rss: 69Mb L: 30/35 MS: 1 InsertRepeatedBytes- 00:07:16.191 [2024-07-12 21:29:54.835170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:584375db cdw11:45fd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.835200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.191 [2024-07-12 21:29:54.835232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.835248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.191 #28 NEW cov: 11753 ft: 14529 corp: 17/338b lim: 35 exec/s: 28 rss: 69Mb L: 17/35 MS: 1 ChangeBit- 00:07:16.191 [2024-07-12 21:29:54.885222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.885250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.191 [2024-07-12 21:29:54.885297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.885312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.191 #29 NEW cov: 11753 ft: 14572 corp: 18/354b lim: 35 exec/s: 29 rss: 69Mb L: 16/35 MS: 1 ChangeBit- 00:07:16.191 [2024-07-12 21:29:54.935582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.935620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.191 [2024-07-12 21:29:54.935671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.935686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.191 [2024-07-12 21:29:54.935715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.935730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.191 [2024-07-12 21:29:54.935757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.935772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.191 [2024-07-12 21:29:54.935800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-07-12 21:29:54.935815] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:16.450 #30 NEW cov: 11753 ft: 14585 corp: 19/389b lim: 35 exec/s: 30 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:16.450 [2024-07-12 21:29:54.995518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00007500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-07-12 21:29:54.995548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.450 #31 NEW cov: 11753 ft: 14599 corp: 20/400b lim: 35 exec/s: 31 rss: 69Mb L: 11/35 MS: 1 EraseBytes- 00:07:16.450 [2024-07-12 21:29:55.045731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00080000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-07-12 21:29:55.045760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.450 [2024-07-12 21:29:55.045807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-07-12 21:29:55.045823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.450 #32 NEW cov: 11753 ft: 14700 corp: 21/416b lim: 35 exec/s: 32 rss: 69Mb L: 16/35 MS: 1 CMP- DE: "\000\000"- 00:07:16.450 [2024-07-12 21:29:55.095795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:dc000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-07-12 21:29:55.095824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.450 #33 NEW cov: 11753 ft: 14727 corp: 22/426b lim: 35 exec/s: 33 rss: 69Mb L: 10/35 MS: 1 InsertByte- 00:07:16.450 [2024-07-12 21:29:55.145872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-07-12 21:29:55.145903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.450 #34 NEW cov: 11753 ft: 14784 corp: 23/435b lim: 35 exec/s: 34 rss: 69Mb L: 9/35 MS: 1 ShuffleBytes- 00:07:16.450 [2024-07-12 21:29:55.206110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:60000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-07-12 21:29:55.206138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.450 [2024-07-12 21:29:55.206189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:000a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-07-12 21:29:55.206204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.709 #35 NEW cov: 11753 ft: 14790 corp: 24/454b lim: 35 exec/s: 35 rss: 69Mb L: 19/35 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:16.709 [2024-07-12 21:29:55.266438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.266487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.266520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.266535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.266563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.266578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.266606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.266620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.266648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.266662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:16.709 #36 NEW cov: 11753 ft: 14799 corp: 25/489b lim: 35 exec/s: 36 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:16.709 [2024-07-12 21:29:55.316351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:584375db cdw11:45fd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.316379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.316426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:28000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.316448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.709 #37 NEW cov: 11753 ft: 14823 corp: 26/506b lim: 35 exec/s: 37 rss: 69Mb L: 17/35 MS: 1 ShuffleBytes- 00:07:16.709 [2024-07-12 21:29:55.366697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.366727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.366759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.366774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.366802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.366818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.366849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.366864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.366891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.366906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:16.709 #38 NEW cov: 11753 ft: 14915 corp: 27/541b lim: 35 exec/s: 38 rss: 69Mb L: 35/35 MS: 1 ChangeBit- 00:07:16.709 [2024-07-12 21:29:55.416743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0075000a cdw11:75750002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.416771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.416818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:75757575 cdw11:75750002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.416833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.416861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:75757575 cdw11:75750002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.416876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.416904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00007500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.416918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.709 #39 NEW cov: 11753 ft: 14948 corp: 28/569b lim: 35 exec/s: 39 rss: 69Mb L: 28/35 MS: 1 InsertRepeatedBytes- 00:07:16.709 [2024-07-12 21:29:55.466970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.466999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.467031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.467046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.467074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.467090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.467117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.467132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.709 [2024-07-12 21:29:55.467159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-07-12 21:29:55.467174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:16.969 #40 NEW cov: 11753 ft: 14983 corp: 29/604b lim: 35 exec/s: 40 rss: 69Mb L: 35/35 MS: 1 ChangeBit- 00:07:16.969 [2024-07-12 21:29:55.526932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.969 [2024-07-12 21:29:55.526961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.969 #41 NEW cov: 11760 ft: 15002 corp: 30/614b lim: 35 exec/s: 41 rss: 69Mb L: 10/35 MS: 1 CopyPart- 00:07:16.969 [2024-07-12 21:29:55.597165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:584375db cdw11:77fd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.969 [2024-07-12 21:29:55.597195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.969 [2024-07-12 21:29:55.597228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.969 [2024-07-12 21:29:55.597244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.969 #42 NEW cov: 11760 ft: 15016 corp: 31/631b lim: 35 exec/s: 21 rss: 69Mb L: 17/35 MS: 1 ChangeByte- 00:07:16.969 #42 DONE cov: 11760 ft: 15016 corp: 31/631b lim: 35 exec/s: 21 rss: 69Mb 00:07:16.969 ###### Recommended dictionary. ###### 00:07:16.969 "u\333XCE\375(\000" # Uses: 1 00:07:16.969 "\000\000" # Uses: 1 00:07:16.969 ###### End of recommended dictionary. 
###### 00:07:16.969 Done 42 runs in 2 second(s) 00:07:17.229 21:29:55 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:17.229 21:29:55 -- ../common.sh@72 -- # (( i++ )) 00:07:17.229 21:29:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:17.229 21:29:55 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:17.229 21:29:55 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:17.229 21:29:55 -- nvmf/run.sh@24 -- # local timen=1 00:07:17.229 21:29:55 -- nvmf/run.sh@25 -- # local core=0x1 00:07:17.229 21:29:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:17.229 21:29:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:17.229 21:29:55 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:17.229 21:29:55 -- nvmf/run.sh@29 -- # port=4405 00:07:17.229 21:29:55 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:17.229 21:29:55 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:17.229 21:29:55 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:17.229 21:29:55 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:17.229 [2024-07-12 21:29:55.798968] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:17.229 [2024-07-12 21:29:55.799022] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3579227 ] 00:07:17.229 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.229 [2024-07-12 21:29:56.000354] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.488 [2024-07-12 21:29:56.065188] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:17.488 [2024-07-12 21:29:56.065330] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.488 [2024-07-12 21:29:56.123359] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:17.488 [2024-07-12 21:29:56.139557] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:17.488 INFO: Running with entropic power schedule (0xFF, 100). 00:07:17.488 INFO: Seed: 1592346249 00:07:17.488 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:17.488 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:17.488 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:17.488 INFO: A corpus is not provided, starting from an empty corpus 00:07:17.488 #2 INITED exec/s: 0 rss: 59Mb 00:07:17.489 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
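The xtrace above shows how nvmf/run.sh stages each fuzzer run: it builds the TCP service port by appending the zero-padded fuzzer index to 44 (printf %02d 5 yields 05, hence trsvcid 4405), rewrites the stock trsvcid 4420 in fuzz_json.conf with sed, and hands the resulting transport ID to llvm_nvme_fuzz through -F. The C sketch below mirrors that port and transport-ID construction under those assumptions; build_trid is a hypothetical helper for illustration, since the real logic is the shell trace itself.

    /* Sketch only: mirrors the trid string visible in the trace above.
     * build_trid is a hypothetical helper; run.sh does this with
     * printf %02d and sed rather than C. */
    #include <stdio.h>

    static void build_trid(char *buf, size_t len, int fuzzer_type)
    {
        /* Port 44<NN>: fuzzer 5 -> 4405, fuzzer 6 -> 4406, matching the
         * "printf %02d" and port= lines in the script trace above. */
        snprintf(buf, len,
                 "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 "
                 "traddr:127.0.0.1 trsvcid:44%02d", fuzzer_type);
    }

    int main(void)
    {
        char trid[128];
        build_trid(trid, sizeof(trid), 5);
        printf("-F '%s'\n", trid);  /* argument seen in the invocation above */
        return 0;
    }

Run with fuzzer index 6, the same scheme yields trsvcid 4406, which matches the next run further below.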
00:07:17.489 This may also happen if the target rejected all inputs we tried so far 00:07:17.489 [2024-07-12 21:29:56.208999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28280b28 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.489 [2024-07-12 21:29:56.209036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.489 [2024-07-12 21:29:56.209159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:17.489 [2024-07-12 21:29:56.209177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.747 NEW_FUNC[1/671]: 0x488f80 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:17.748 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:17.748 #16 NEW cov: 11543 ft: 11542 corp: 2/27b lim: 45 exec/s: 0 rss: 66Mb L: 26/26 MS: 4 ShuffleBytes-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:07:18.006 [2024-07-12 21:29:56.539686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.006 [2024-07-12 21:29:56.539726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.006 [2024-07-12 21:29:56.539850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.006 [2024-07-12 21:29:56.539867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.006 #17 NEW cov: 11656 ft: 12078 corp: 3/45b lim: 45 exec/s: 0 rss: 66Mb L: 18/26 MS: 1 EraseBytes- 00:07:18.006 [2024-07-12 21:29:56.589446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.006 [2024-07-12 21:29:56.589476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.006 #23 NEW cov: 11662 ft: 13071 corp: 4/60b lim: 45 exec/s: 0 rss: 66Mb L: 15/26 MS: 1 EraseBytes- 00:07:18.006 [2024-07-12 21:29:56.629592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2828a523 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.006 [2024-07-12 21:29:56.629620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.006 #28 NEW cov: 11747 ft: 13459 corp: 5/72b lim: 45 exec/s: 0 rss: 66Mb L: 12/26 MS: 5 ChangeBit-InsertByte-InsertByte-ChangeByte-CrossOver- 00:07:18.006 [2024-07-12 21:29:56.669915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.006 [2024-07-12 21:29:56.669942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
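Every NOTICE pair in the run above is one mutated admin command and its completion: fuzz_admin_create_io_submission_queue_command submits CREATE IO SQ (opcode 01) with fuzzer-chosen cdw10/cdw11 values, and the target answers INVALID OPCODE (00/01) because an NVMe-oF target creates queues through the Fabrics Connect command and does not implement the memory-transport queue-management opcodes. The sketch below decodes the fields the log prints; it uses a local stand-in struct for illustration rather than SPDK's struct spdk_nvme_cmd.

    /* Illustrative stand-in for the fields printed above; not SPDK's
     * struct spdk_nvme_cmd. Field meanings follow the NVMe spec for
     * CREATE IO SQ. */
    #include <stdint.h>
    #include <stdio.h>

    struct admin_cmd {
        uint8_t  opc;    /* 0x01 = CREATE IO SQ, the "(01)" in the log */
        uint16_t cid;    /* command id echoed back in the completion   */
        uint32_t cdw10;  /* bits 15:0 queue ID, bits 31:16 queue size  */
        uint32_t cdw11;  /* bit 0 PC, bits 2:1 QPRIO, bits 31:16 CQID  */
    };

    int main(void)
    {
        /* First mutated command of the run: cdw10:28280b28 cdw11:28280001 */
        struct admin_cmd cmd = { .opc = 0x01, .cid = 4,
                                 .cdw10 = 0x28280b28, .cdw11 = 0x28280001 };
        printf("CREATE IO SQ qid:%u qsize:%u cqid:%u\n",
               (unsigned)(cmd.cdw10 & 0xffff),
               (unsigned)(cmd.cdw10 >> 16),
               (unsigned)(cmd.cdw11 >> 16));
        return 0;
    }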
00:07:18.006 [2024-07-12 21:29:56.670060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.006 [2024-07-12 21:29:56.670076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.006 #29 NEW cov: 11747 ft: 13578 corp: 6/90b lim: 45 exec/s: 0 rss: 66Mb L: 18/26 MS: 1 ShuffleBytes- 00:07:18.006 [2024-07-12 21:29:56.710126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28280b28 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.007 [2024-07-12 21:29:56.710153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.007 [2024-07-12 21:29:56.710271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.007 [2024-07-12 21:29:56.710287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.007 #30 NEW cov: 11747 ft: 13646 corp: 7/116b lim: 45 exec/s: 0 rss: 66Mb L: 26/26 MS: 1 ShuffleBytes- 00:07:18.007 [2024-07-12 21:29:56.750774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28280b28 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.007 [2024-07-12 21:29:56.750801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.007 [2024-07-12 21:29:56.750923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.007 [2024-07-12 21:29:56.750939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.007 [2024-07-12 21:29:56.751062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:28280028 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.007 [2024-07-12 21:29:56.751077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.007 [2024-07-12 21:29:56.751202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.007 [2024-07-12 21:29:56.751218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.007 #31 NEW cov: 11747 ft: 14045 corp: 8/153b lim: 45 exec/s: 0 rss: 66Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:07:18.266 [2024-07-12 21:29:56.800368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:1a280b00 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.800396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.266 [2024-07-12 21:29:56.800513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.800542] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.266 #32 NEW cov: 11747 ft: 14083 corp: 9/179b lim: 45 exec/s: 0 rss: 66Mb L: 26/37 MS: 1 ChangeBinInt- 00:07:18.266 [2024-07-12 21:29:56.840196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:8c8cde8c cdw11:8c8c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.840221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.266 #37 NEW cov: 11747 ft: 14092 corp: 10/188b lim: 45 exec/s: 0 rss: 66Mb L: 9/37 MS: 5 CrossOver-InsertRepeatedBytes-CopyPart-ChangeBinInt-CopyPart- 00:07:18.266 [2024-07-12 21:29:56.880638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.880665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.266 [2024-07-12 21:29:56.880784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.880803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.266 #38 NEW cov: 11747 ft: 14138 corp: 11/206b lim: 45 exec/s: 0 rss: 66Mb L: 18/37 MS: 1 CopyPart- 00:07:18.266 [2024-07-12 21:29:56.921509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28280b28 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.921535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.266 [2024-07-12 21:29:56.921609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.921625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.266 [2024-07-12 21:29:56.921742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff00ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.921758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.266 [2024-07-12 21:29:56.921877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.921891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.266 [2024-07-12 21:29:56.922016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.922031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.266 #39 NEW cov: 11747 ft: 14232 corp: 12/251b lim: 45 exec/s: 0 rss: 66Mb L: 
45/45 MS: 1 InsertRepeatedBytes- 00:07:18.266 [2024-07-12 21:29:56.970901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:8c8cde8c cdw11:8c8c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.970928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.266 [2024-07-12 21:29:56.971043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:56.971059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.266 #40 NEW cov: 11747 ft: 14259 corp: 13/269b lim: 45 exec/s: 0 rss: 66Mb L: 18/45 MS: 1 InsertRepeatedBytes- 00:07:18.266 [2024-07-12 21:29:57.011011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28280b28 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:57.011038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.266 [2024-07-12 21:29:57.011151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:08280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.266 [2024-07-12 21:29:57.011169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.266 #41 NEW cov: 11747 ft: 14271 corp: 14/295b lim: 45 exec/s: 0 rss: 67Mb L: 26/45 MS: 1 ChangeBit- 00:07:18.525 [2024-07-12 21:29:57.050883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.525 [2024-07-12 21:29:57.050910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.525 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:18.525 #42 NEW cov: 11770 ft: 14305 corp: 15/310b lim: 45 exec/s: 0 rss: 67Mb L: 15/45 MS: 1 ChangeBinInt- 00:07:18.525 [2024-07-12 21:29:57.091235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:8c8cde8c cdw11:8c8c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.525 [2024-07-12 21:29:57.091262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.525 [2024-07-12 21:29:57.091381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.525 [2024-07-12 21:29:57.091396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.525 #43 NEW cov: 11770 ft: 14336 corp: 16/328b lim: 45 exec/s: 0 rss: 67Mb L: 18/45 MS: 1 ChangeBit- 00:07:18.525 [2024-07-12 21:29:57.131907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28280b28 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.525 [2024-07-12 21:29:57.131933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:18.525 [2024-07-12 21:29:57.132050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.525 [2024-07-12 21:29:57.132066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.525 [2024-07-12 21:29:57.132174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:28280028 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.525 [2024-07-12 21:29:57.132189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.525 [2024-07-12 21:29:57.132306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.525 [2024-07-12 21:29:57.132323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.525 #44 NEW cov: 11770 ft: 14370 corp: 17/365b lim: 45 exec/s: 0 rss: 67Mb L: 37/45 MS: 1 ShuffleBytes- 00:07:18.525 [2024-07-12 21:29:57.171911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d7d8f5d7 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.525 [2024-07-12 21:29:57.171937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.525 [2024-07-12 21:29:57.172058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.525 [2024-07-12 21:29:57.172075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.525 [2024-07-12 21:29:57.172187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:28280028 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.525 [2024-07-12 21:29:57.172204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.525 [2024-07-12 21:29:57.172309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.525 [2024-07-12 21:29:57.172326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.525 #45 NEW cov: 11770 ft: 14403 corp: 18/402b lim: 45 exec/s: 45 rss: 67Mb L: 37/45 MS: 1 ChangeBinInt- 00:07:18.526 [2024-07-12 21:29:57.232290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d7d8f5d7 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.526 [2024-07-12 21:29:57.232319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.526 [2024-07-12 21:29:57.232451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.526 [2024-07-12 21:29:57.232468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.526 [2024-07-12 21:29:57.232597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:28000028 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.526 [2024-07-12 21:29:57.232613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.526 [2024-07-12 21:29:57.232736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.526 [2024-07-12 21:29:57.232753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.526 #46 NEW cov: 11770 ft: 14447 corp: 19/446b lim: 45 exec/s: 46 rss: 67Mb L: 44/45 MS: 1 CopyPart- 00:07:18.526 [2024-07-12 21:29:57.281849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d7d8f5d7 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.526 [2024-07-12 21:29:57.281878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.526 [2024-07-12 21:29:57.281997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.526 [2024-07-12 21:29:57.282013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.526 #47 NEW cov: 11770 ft: 14482 corp: 20/466b lim: 45 exec/s: 47 rss: 67Mb L: 20/45 MS: 1 EraseBytes- 00:07:18.785 [2024-07-12 21:29:57.321268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d7d728d8 cdw11:d0280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.785 [2024-07-12 21:29:57.321296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.785 #48 NEW cov: 11770 ft: 14524 corp: 21/481b lim: 45 exec/s: 48 rss: 67Mb L: 15/45 MS: 1 ChangeBinInt- 00:07:18.785 [2024-07-12 21:29:57.361733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d7d728d8 cdw11:d0280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.785 [2024-07-12 21:29:57.361761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.785 [2024-07-12 21:29:57.361873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.785 [2024-07-12 21:29:57.361891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.785 #49 NEW cov: 11770 ft: 14568 corp: 22/504b lim: 45 exec/s: 49 rss: 67Mb L: 23/45 MS: 1 CrossOver- 00:07:18.785 [2024-07-12 21:29:57.412210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:001a0000 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.785 [2024-07-12 21:29:57.412237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.785 [2024-07-12 21:29:57.412357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ 
(01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.785 [2024-07-12 21:29:57.412377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.785 #50 NEW cov: 11770 ft: 14623 corp: 23/530b lim: 45 exec/s: 50 rss: 67Mb L: 26/45 MS: 1 ChangeBinInt- 00:07:18.785 [2024-07-12 21:29:57.453015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d7d8f5d7 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.785 [2024-07-12 21:29:57.453043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.785 [2024-07-12 21:29:57.453159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.786 [2024-07-12 21:29:57.453178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.786 [2024-07-12 21:29:57.453260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:28280028 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.786 [2024-07-12 21:29:57.453276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.786 [2024-07-12 21:29:57.453396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.786 [2024-07-12 21:29:57.453413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.786 #51 NEW cov: 11770 ft: 14632 corp: 24/567b lim: 45 exec/s: 51 rss: 67Mb L: 37/45 MS: 1 ShuffleBytes- 00:07:18.786 [2024-07-12 21:29:57.493006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.786 [2024-07-12 21:29:57.493034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.786 [2024-07-12 21:29:57.493164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.786 [2024-07-12 21:29:57.493181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.786 [2024-07-12 21:29:57.493301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.786 [2024-07-12 21:29:57.493317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.786 [2024-07-12 21:29:57.493435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.786 [2024-07-12 21:29:57.493455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.786 #52 NEW cov: 11770 ft: 14641 corp: 25/604b lim: 45 exec/s: 52 rss: 67Mb L: 37/45 MS: 1 
InsertRepeatedBytes- 00:07:18.786 [2024-07-12 21:29:57.552371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2e28a523 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:18.786 [2024-07-12 21:29:57.552399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.045 #53 NEW cov: 11770 ft: 14696 corp: 26/616b lim: 45 exec/s: 53 rss: 67Mb L: 12/45 MS: 1 ChangeBinInt- 00:07:19.045 [2024-07-12 21:29:57.602364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28280b28 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.045 [2024-07-12 21:29:57.602391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.045 [2024-07-12 21:29:57.602502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:08280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.045 [2024-07-12 21:29:57.602520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.045 #54 NEW cov: 11770 ft: 14730 corp: 27/642b lim: 45 exec/s: 54 rss: 67Mb L: 26/45 MS: 1 ChangeBit- 00:07:19.045 [2024-07-12 21:29:57.643415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.045 [2024-07-12 21:29:57.643446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.045 [2024-07-12 21:29:57.643574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.045 [2024-07-12 21:29:57.643593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.045 [2024-07-12 21:29:57.643705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:28000028 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.045 [2024-07-12 21:29:57.643722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.045 [2024-07-12 21:29:57.643845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.045 [2024-07-12 21:29:57.643861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.045 #55 NEW cov: 11770 ft: 14803 corp: 28/686b lim: 45 exec/s: 55 rss: 68Mb L: 44/45 MS: 1 CopyPart- 00:07:19.045 [2024-07-12 21:29:57.693418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d7d8f5d7 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.045 [2024-07-12 21:29:57.693448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.045 [2024-07-12 21:29:57.693576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.045 [2024-07-12 
21:29:57.693591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.045 [2024-07-12 21:29:57.693715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:b1280028 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.045 [2024-07-12 21:29:57.693731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.045 [2024-07-12 21:29:57.693848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.045 [2024-07-12 21:29:57.693863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.045 #56 NEW cov: 11770 ft: 14810 corp: 29/724b lim: 45 exec/s: 56 rss: 68Mb L: 38/45 MS: 1 InsertByte- 00:07:19.045 [2024-07-12 21:29:57.733383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2828600b cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.045 [2024-07-12 21:29:57.733410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.046 [2024-07-12 21:29:57.733546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.046 [2024-07-12 21:29:57.733562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.046 [2024-07-12 21:29:57.733680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.046 [2024-07-12 21:29:57.733696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.046 #57 NEW cov: 11770 ft: 15064 corp: 30/751b lim: 45 exec/s: 57 rss: 68Mb L: 27/45 MS: 1 InsertByte- 00:07:19.046 [2024-07-12 21:29:57.773738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28287728 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.046 [2024-07-12 21:29:57.773764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.046 [2024-07-12 21:29:57.773884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.046 [2024-07-12 21:29:57.773901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.046 [2024-07-12 21:29:57.774025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:77777777 cdw11:77770003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.046 [2024-07-12 21:29:57.774040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.046 [2024-07-12 21:29:57.774164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.046 [2024-07-12 
21:29:57.774181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.046 #59 NEW cov: 11770 ft: 15074 corp: 31/787b lim: 45 exec/s: 59 rss: 68Mb L: 36/45 MS: 2 CrossOver-CrossOver- 00:07:19.046 [2024-07-12 21:29:57.813951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.046 [2024-07-12 21:29:57.813978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.046 [2024-07-12 21:29:57.814094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.046 [2024-07-12 21:29:57.814112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.046 [2024-07-12 21:29:57.814232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:28000028 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.046 [2024-07-12 21:29:57.814249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.046 [2024-07-12 21:29:57.814363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:282c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.046 [2024-07-12 21:29:57.814379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.305 #60 NEW cov: 11770 ft: 15075 corp: 32/831b lim: 45 exec/s: 60 rss: 68Mb L: 44/45 MS: 1 ChangeByte- 00:07:19.305 [2024-07-12 21:29:57.863502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28280b28 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.863528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.305 [2024-07-12 21:29:57.863648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28280028 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.863676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.305 #61 NEW cov: 11770 ft: 15083 corp: 33/856b lim: 45 exec/s: 61 rss: 68Mb L: 25/45 MS: 1 EraseBytes- 00:07:19.305 [2024-07-12 21:29:57.904122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.904148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.305 [2024-07-12 21:29:57.904284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.904301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.305 [2024-07-12 21:29:57.904378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE 
IO SQ (01) qid:0 cid:6 nsid:0 cdw10:28000028 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.904394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.305 [2024-07-12 21:29:57.904509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.904526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.305 #62 NEW cov: 11770 ft: 15089 corp: 34/900b lim: 45 exec/s: 62 rss: 68Mb L: 44/45 MS: 1 ShuffleBytes- 00:07:19.305 [2024-07-12 21:29:57.943852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:8c8fde8c cdw11:8c8c0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.943879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.305 [2024-07-12 21:29:57.944004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.944020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.305 #63 NEW cov: 11770 ft: 15093 corp: 35/918b lim: 45 exec/s: 63 rss: 68Mb L: 18/45 MS: 1 ChangeBinInt- 00:07:19.305 [2024-07-12 21:29:57.984484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28280b28 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.984513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.305 [2024-07-12 21:29:57.984632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00002828 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.984649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.305 [2024-07-12 21:29:57.984763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.984778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.305 [2024-07-12 21:29:57.984898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:57.984912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.305 #64 NEW cov: 11770 ft: 15098 corp: 36/958b lim: 45 exec/s: 64 rss: 68Mb L: 40/45 MS: 1 CopyPart- 00:07:19.305 [2024-07-12 21:29:58.024251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28280b28 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:58.024276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:19.305 [2024-07-12 21:29:58.024397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28002828 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:58.024413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.305 [2024-07-12 21:29:58.024548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:28280028 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:58.024564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.305 #65 NEW cov: 11770 ft: 15099 corp: 37/991b lim: 45 exec/s: 65 rss: 68Mb L: 33/45 MS: 1 InsertRepeatedBytes- 00:07:19.305 [2024-07-12 21:29:58.064461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:58.064488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.305 [2024-07-12 21:29:58.064604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.305 [2024-07-12 21:29:58.064621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.305 [2024-07-12 21:29:58.064734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.306 [2024-07-12 21:29:58.064751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.306 #66 NEW cov: 11770 ft: 15127 corp: 38/1023b lim: 45 exec/s: 66 rss: 68Mb L: 32/45 MS: 1 InsertRepeatedBytes- 00:07:19.565 [2024-07-12 21:29:58.104257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2828600b cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.565 [2024-07-12 21:29:58.104284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.565 [2024-07-12 21:29:58.104405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.565 [2024-07-12 21:29:58.104421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.565 #67 NEW cov: 11770 ft: 15137 corp: 39/1042b lim: 45 exec/s: 67 rss: 68Mb L: 19/45 MS: 1 CrossOver- 00:07:19.565 [2024-07-12 21:29:58.145000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.565 [2024-07-12 21:29:58.145026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.565 [2024-07-12 21:29:58.145134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.565 [2024-07-12 
21:29:58.145161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.565 [2024-07-12 21:29:58.145277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:28000028 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.565 [2024-07-12 21:29:58.145291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.565 [2024-07-12 21:29:58.145411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:28282828 cdw11:28280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.565 [2024-07-12 21:29:58.145428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.565 #68 NEW cov: 11770 ft: 15146 corp: 40/1086b lim: 45 exec/s: 68 rss: 68Mb L: 44/45 MS: 1 ShuffleBytes- 00:07:19.565 [2024-07-12 21:29:58.184246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:28282828 cdw11:2b280001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.565 [2024-07-12 21:29:58.184276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.565 #69 NEW cov: 11770 ft: 15212 corp: 41/1102b lim: 45 exec/s: 34 rss: 68Mb L: 16/45 MS: 1 InsertByte- 00:07:19.565 #69 DONE cov: 11770 ft: 15212 corp: 41/1102b lim: 45 exec/s: 34 rss: 68Mb 00:07:19.565 Done 69 runs in 2 second(s) 00:07:19.565 21:29:58 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:19.565 21:29:58 -- ../common.sh@72 -- # (( i++ )) 00:07:19.565 21:29:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:19.565 21:29:58 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:19.565 21:29:58 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:19.565 21:29:58 -- nvmf/run.sh@24 -- # local timen=1 00:07:19.565 21:29:58 -- nvmf/run.sh@25 -- # local core=0x1 00:07:19.565 21:29:58 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:19.565 21:29:58 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:19.565 21:29:58 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:19.565 21:29:58 -- nvmf/run.sh@29 -- # port=4406 00:07:19.565 21:29:58 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:19.565 21:29:58 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:19.565 21:29:58 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:19.823 21:29:58 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:19.823 [2024-07-12 21:29:58.375534] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
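The invocation repeated above for fuzzer 6 follows the same pattern as fuzzer 5: as used here, -Z selects the fuzzer type, -t sets the time budget, -D names the per-type corpus directory, -P the output directory for artifacts, -c the rewritten JSON config, and -F the transport ID of the freshly started TCP listener. The NEW_FUNC entries that follow show the harness entry point TestOneInput; the sketch below gives only the generic libFuzzer-style shape of such an entry point. The fuzz_cmd struct and the body are assumptions for illustration; SPDK's actual TestOneInput in llvm_nvme_fuzz.c builds complete NVMe commands from the input bytes and submits them to the target.

    /* Generic libFuzzer harness shape, built with
     *   clang -fsanitize=fuzzer harness.c
     * Everything inside is an illustrative assumption, not SPDK's code. */
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    struct fuzz_cmd {           /* hypothetical command image */
        uint8_t  opc;
        uint32_t cdw10;
        uint32_t cdw11;
    };

    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        struct fuzz_cmd cmd;

        if (size < sizeof(cmd)) {
            return 0;                    /* too short to form a command */
        }
        memcpy(&cmd, data, sizeof(cmd)); /* mutated bytes become fields */
        /* ... submit cmd on the admin queue, poll for the completion,
         *     and let NOTICE lines like those above record both sides ... */
        return 0;
    }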
00:07:19.823 [2024-07-12 21:29:58.375599] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3579685 ] 00:07:19.823 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.823 [2024-07-12 21:29:58.558949] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.109 [2024-07-12 21:29:58.626236] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:20.109 [2024-07-12 21:29:58.626377] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.109 [2024-07-12 21:29:58.684798] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:20.109 [2024-07-12 21:29:58.701064] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:20.109 INFO: Running with entropic power schedule (0xFF, 100). 00:07:20.109 INFO: Seed: 4155367899 00:07:20.109 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:20.109 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:20.109 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:20.109 INFO: A corpus is not provided, starting from an empty corpus 00:07:20.109 #2 INITED exec/s: 0 rss: 61Mb 00:07:20.109 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:20.109 This may also happen if the target rejected all inputs we tried so far 00:07:20.109 [2024-07-12 21:29:58.767054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a31 cdw11:00000000 00:07:20.109 [2024-07-12 21:29:58.767087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.368 NEW_FUNC[1/669]: 0x48b790 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:20.368 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:20.368 #4 NEW cov: 11460 ft: 11461 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 2 ShuffleBytes-InsertByte- 00:07:20.368 [2024-07-12 21:29:59.108390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a31 cdw11:00000000 00:07:20.368 [2024-07-12 21:29:59.108459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.368 [2024-07-12 21:29:59.108612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000108 cdw11:00000000 00:07:20.368 [2024-07-12 21:29:59.108639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.368 #5 NEW cov: 11573 ft: 12139 corp: 3/7b lim: 10 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 CMP- DE: "\001\010"- 00:07:20.626 [2024-07-12 21:29:59.158006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a04 cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.158034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
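Fuzzer 6 targets DELETE IO CQ (opcode 04). Unlike queue creation, the command has a single live field, the 16-bit queue identifier in cdw10 bits 15:0, which is why this run caps inputs at 10 bytes (lim: 10) where the CREATE IO SQ run above allowed 45. The DE: "\001\010" entries show libFuzzer's comparison tracing capturing the two-byte sequence 0x01 0x08 and replaying it as a persistent auto-dictionary entry. A small decoding sketch follows; delete_cq_qid is an illustrative helper name, not SPDK code.

    /* DELETE IO CQ (opcode 0x04) uses only cdw10 bits 15:0 (the queue
     * ID); delete_cq_qid is a hypothetical helper for illustration. */
    #include <stdint.h>
    #include <stdio.h>

    static uint16_t delete_cq_qid(uint32_t cdw10)
    {
        return (uint16_t)(cdw10 & 0xffff);
    }

    int main(void)
    {
        uint32_t cdw10 = 0x00000a31;  /* "cdw10:00000a31" from the log */
        printf("delete CQ qid:%u\n", (unsigned)delete_cq_qid(cdw10));
        return 0;
    }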
00:07:20.627 #8 NEW cov: 11579 ft: 12468 corp: 4/9b lim: 10 exec/s: 0 rss: 67Mb L: 2/4 MS: 3 CopyPart-ChangeBit-InsertByte- 00:07:20.627 [2024-07-12 21:29:59.198261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a30 cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.198291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.627 [2024-07-12 21:29:59.198405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000108 cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.198423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.627 #9 NEW cov: 11664 ft: 12677 corp: 5/13b lim: 10 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ChangeASCIIInt- 00:07:20.627 [2024-07-12 21:29:59.248512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a31 cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.248537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.627 [2024-07-12 21:29:59.248653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000010a cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.248668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.627 #10 NEW cov: 11664 ft: 12864 corp: 6/18b lim: 10 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 CopyPart- 00:07:20.627 [2024-07-12 21:29:59.288398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.288423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.627 #12 NEW cov: 11664 ft: 12967 corp: 7/20b lim: 10 exec/s: 0 rss: 68Mb L: 2/5 MS: 2 ShuffleBytes-CopyPart- 00:07:20.627 [2024-07-12 21:29:59.318486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0b cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.318513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.627 #13 NEW cov: 11664 ft: 13018 corp: 8/22b lim: 10 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:07:20.627 [2024-07-12 21:29:59.359448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.359473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.627 [2024-07-12 21:29:59.359591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.359608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.627 [2024-07-12 21:29:59.359725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.359740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:07:20.627 [2024-07-12 21:29:59.359853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.359870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.627 [2024-07-12 21:29:59.359981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000a10a cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.359997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:20.627 #14 NEW cov: 11664 ft: 13331 corp: 9/32b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:20.627 [2024-07-12 21:29:59.398892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a31 cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.398918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.627 [2024-07-12 21:29:59.399028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000108 cdw11:00000000 00:07:20.627 [2024-07-12 21:29:59.399044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.886 #15 NEW cov: 11664 ft: 13348 corp: 10/36b lim: 10 exec/s: 0 rss: 68Mb L: 4/10 MS: 1 ShuffleBytes- 00:07:20.886 [2024-07-12 21:29:59.428834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0b cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.428860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.886 #16 NEW cov: 11664 ft: 13381 corp: 11/38b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 CopyPart- 00:07:20.886 [2024-07-12 21:29:59.469310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a30 cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.469336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.886 [2024-07-12 21:29:59.469461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.469479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.886 [2024-07-12 21:29:59.469583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000b08 cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.469598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.886 #17 NEW cov: 11664 ft: 13537 corp: 12/44b lim: 10 exec/s: 0 rss: 68Mb L: 6/10 MS: 1 CrossOver- 00:07:20.886 [2024-07-12 21:29:59.509915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.509939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.886 [2024-07-12 21:29:59.510052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.510068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.886 [2024-07-12 21:29:59.510181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a19b cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.510196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:20.886 [2024-07-12 21:29:59.510311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.510327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:20.886 [2024-07-12 21:29:59.510437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000a10a cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.510456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:20.886 #18 NEW cov: 11664 ft: 13578 corp: 13/54b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:20.886 [2024-07-12 21:29:59.549251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a32 cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.549278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.886 #19 NEW cov: 11664 ft: 13683 corp: 14/57b lim: 10 exec/s: 0 rss: 68Mb L: 3/10 MS: 1 InsertByte- 00:07:20.886 [2024-07-12 21:29:59.589308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.589333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.886 #21 NEW cov: 11664 ft: 13696 corp: 15/60b lim: 10 exec/s: 0 rss: 68Mb L: 3/10 MS: 2 ShuffleBytes-PersAutoDict- DE: "\001\010"- 00:07:20.886 [2024-07-12 21:29:59.619405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0b cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.619430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.886 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:20.886 #22 NEW cov: 11687 ft: 13727 corp: 16/63b lim: 10 exec/s: 0 rss: 68Mb L: 3/10 MS: 1 InsertByte- 00:07:20.886 [2024-07-12 21:29:59.659962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.659989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:20.886 [2024-07-12 21:29:59.660106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.660123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:20.886 [2024-07-12 21:29:59.660231] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:20.886 [2024-07-12 21:29:59.660246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.145 #23 NEW cov: 11687 ft: 13741 corp: 17/69b lim: 10 exec/s: 0 rss: 68Mb L: 6/10 MS: 1 CMP- DE: "\377\377\377\000"- 00:07:21.145 [2024-07-12 21:29:59.699708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a1e cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.699735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.145 #25 NEW cov: 11687 ft: 13766 corp: 18/72b lim: 10 exec/s: 0 rss: 68Mb L: 3/10 MS: 2 EraseBytes-CMP- DE: "\036\000"- 00:07:21.145 [2024-07-12 21:29:59.739948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a31 cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.739976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.145 [2024-07-12 21:29:59.740095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000013d cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.740111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.145 #26 NEW cov: 11687 ft: 13801 corp: 19/76b lim: 10 exec/s: 26 rss: 69Mb L: 4/10 MS: 1 ChangeByte- 00:07:21.145 [2024-07-12 21:29:59.780073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e31 cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.780101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.145 [2024-07-12 21:29:59.780224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000108 cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.780242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.145 #27 NEW cov: 11687 ft: 13839 corp: 20/80b lim: 10 exec/s: 27 rss: 69Mb L: 4/10 MS: 1 ChangeBit- 00:07:21.145 [2024-07-12 21:29:59.820196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.820221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.145 [2024-07-12 21:29:59.820333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000b0b cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.820349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.145 #28 NEW cov: 11687 ft: 13846 corp: 21/84b lim: 10 exec/s: 28 rss: 69Mb L: 4/10 MS: 1 CopyPart- 00:07:21.145 [2024-07-12 21:29:59.860353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.860381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.145 [2024-07-12 
21:29:59.860512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000801 cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.860530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.145 #29 NEW cov: 11687 ft: 13863 corp: 22/89b lim: 10 exec/s: 29 rss: 69Mb L: 5/10 MS: 1 PersAutoDict- DE: "\001\010"- 00:07:21.145 [2024-07-12 21:29:59.900690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a30 cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.900716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.145 [2024-07-12 21:29:59.900825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.900843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.145 [2024-07-12 21:29:59.900941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000b08 cdw11:00000000 00:07:21.145 [2024-07-12 21:29:59.900958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.145 #30 NEW cov: 11687 ft: 13883 corp: 23/95b lim: 10 exec/s: 30 rss: 69Mb L: 6/10 MS: 1 ShuffleBytes- 00:07:21.404 [2024-07-12 21:29:59.940725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001e00 cdw11:00000000 00:07:21.404 [2024-07-12 21:29:59.940752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.404 [2024-07-12 21:29:59.940871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a31 cdw11:00000000 00:07:21.404 [2024-07-12 21:29:59.940886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.404 #31 NEW cov: 11687 ft: 13887 corp: 24/99b lim: 10 exec/s: 31 rss: 69Mb L: 4/10 MS: 1 PersAutoDict- DE: "\036\000"- 00:07:21.404 [2024-07-12 21:29:59.980578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000320b cdw11:00000000 00:07:21.404 [2024-07-12 21:29:59.980605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.404 #32 NEW cov: 11687 ft: 13911 corp: 25/102b lim: 10 exec/s: 32 rss: 69Mb L: 3/10 MS: 1 CopyPart- 00:07:21.404 [2024-07-12 21:30:00.021037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a30 cdw11:00000000 00:07:21.404 [2024-07-12 21:30:00.021065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.404 [2024-07-12 21:30:00.021178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:21.404 [2024-07-12 21:30:00.021196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.404 [2024-07-12 21:30:00.021318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000b08 cdw11:00000000 00:07:21.404 [2024-07-12 21:30:00.021335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.404 #33 NEW cov: 11687 ft: 13918 corp: 26/108b lim: 10 exec/s: 33 rss: 69Mb L: 6/10 MS: 1 ShuffleBytes- 00:07:21.404 [2024-07-12 21:30:00.061031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009ba1 cdw11:00000000 00:07:21.404 [2024-07-12 21:30:00.061061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.404 [2024-07-12 21:30:00.061180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:21.404 [2024-07-12 21:30:00.061200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.404 #34 NEW cov: 11687 ft: 13934 corp: 27/113b lim: 10 exec/s: 34 rss: 69Mb L: 5/10 MS: 1 EraseBytes- 00:07:21.404 [2024-07-12 21:30:00.100910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e31 cdw11:00000000 00:07:21.404 [2024-07-12 21:30:00.100938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.404 #35 NEW cov: 11687 ft: 13937 corp: 28/115b lim: 10 exec/s: 35 rss: 69Mb L: 2/10 MS: 1 EraseBytes- 00:07:21.404 [2024-07-12 21:30:00.141097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:21.404 [2024-07-12 21:30:00.141125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.404 #37 NEW cov: 11687 ft: 13961 corp: 29/117b lim: 10 exec/s: 37 rss: 69Mb L: 2/10 MS: 2 EraseBytes-CopyPart- 00:07:21.404 [2024-07-12 21:30:00.181164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a1d cdw11:00000000 00:07:21.404 [2024-07-12 21:30:00.181190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.663 #38 NEW cov: 11687 ft: 13970 corp: 30/120b lim: 10 exec/s: 38 rss: 69Mb L: 3/10 MS: 1 ChangeBinInt- 00:07:21.663 [2024-07-12 21:30:00.221331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000b32 cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.221358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.663 #39 NEW cov: 11687 ft: 13997 corp: 31/123b lim: 10 exec/s: 39 rss: 70Mb L: 3/10 MS: 1 ShuffleBytes- 00:07:21.663 [2024-07-12 21:30:00.262235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.262262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.663 [2024-07-12 21:30:00.262381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005c5e cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.262397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:21.663 [2024-07-12 21:30:00.262515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005e5e cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.262532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.663 [2024-07-12 21:30:00.262649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005e5e cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.262666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.663 [2024-07-12 21:30:00.262786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00005ef5 cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.262802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:21.663 #40 NEW cov: 11687 ft: 14003 corp: 32/133b lim: 10 exec/s: 40 rss: 70Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:21.663 [2024-07-12 21:30:00.302197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aa1 cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.302225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.663 [2024-07-12 21:30:00.302346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000aa1 cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.302363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.663 [2024-07-12 21:30:00.302474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.302491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.663 [2024-07-12 21:30:00.302565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000a1a1 cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.302580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.663 #41 NEW cov: 11687 ft: 14018 corp: 33/141b lim: 10 exec/s: 41 rss: 70Mb L: 8/10 MS: 1 CrossOver- 00:07:21.663 [2024-07-12 21:30:00.352358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.352384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.663 [2024-07-12 21:30:00.352498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.352514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.663 [2024-07-12 21:30:00.352627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.352644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:21.663 [2024-07-12 21:30:00.352754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.352771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:21.663 #42 NEW cov: 11687 ft: 14089 corp: 34/149b lim: 10 exec/s: 42 rss: 70Mb L: 8/10 MS: 1 ChangeBinInt- 00:07:21.663 [2024-07-12 21:30:00.401865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000afe cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.401896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.663 #43 NEW cov: 11687 ft: 14107 corp: 35/151b lim: 10 exec/s: 43 rss: 70Mb L: 2/10 MS: 1 EraseBytes- 00:07:21.663 [2024-07-12 21:30:00.441972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000320b cdw11:00000000 00:07:21.663 [2024-07-12 21:30:00.442000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.922 #44 NEW cov: 11687 ft: 14157 corp: 36/154b lim: 10 exec/s: 44 rss: 70Mb L: 3/10 MS: 1 ShuffleBytes- 00:07:21.922 [2024-07-12 21:30:00.482595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a30 cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.482622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.922 [2024-07-12 21:30:00.482732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000108 cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.482748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.922 [2024-07-12 21:30:00.482863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000780a cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.482881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.922 #45 NEW cov: 11687 ft: 14165 corp: 37/160b lim: 10 exec/s: 45 rss: 70Mb L: 6/10 MS: 1 CMP- DE: "x\012"- 00:07:21.922 [2024-07-12 21:30:00.522465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000801 cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.522493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.922 [2024-07-12 21:30:00.522607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000801 cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.522624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.922 #46 NEW cov: 11687 ft: 14170 corp: 38/165b lim: 10 exec/s: 46 rss: 70Mb L: 5/10 MS: 1 ChangeBit- 00:07:21.922 [2024-07-12 21:30:00.562537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000f0a cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.562562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.922 [2024-07-12 21:30:00.562669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00001e00 cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.562687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.922 #47 NEW cov: 11687 ft: 14184 corp: 39/169b lim: 10 exec/s: 47 rss: 70Mb L: 4/10 MS: 1 InsertByte- 00:07:21.922 [2024-07-12 21:30:00.602493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a48 cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.602544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.922 #48 NEW cov: 11687 ft: 14208 corp: 40/171b lim: 10 exec/s: 48 rss: 70Mb L: 2/10 MS: 1 ChangeByte- 00:07:21.922 [2024-07-12 21:30:00.643035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000b32 cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.643062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.922 [2024-07-12 21:30:00.643179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.643198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:21.922 [2024-07-12 21:30:00.643319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff00 cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.643337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:21.922 #49 NEW cov: 11687 ft: 14213 corp: 41/178b lim: 10 exec/s: 49 rss: 70Mb L: 7/10 MS: 1 PersAutoDict- DE: "\377\377\377\000"- 00:07:21.922 [2024-07-12 21:30:00.692928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a1e cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.692954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:21.922 [2024-07-12 21:30:00.693069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000001d cdw11:00000000 00:07:21.922 [2024-07-12 21:30:00.693085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.181 #50 NEW cov: 11687 ft: 14222 corp: 42/183b lim: 10 exec/s: 50 rss: 70Mb L: 5/10 MS: 1 PersAutoDict- DE: "\036\000"- 00:07:22.181 [2024-07-12 21:30:00.733125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001e00 cdw11:00000000 00:07:22.181 [2024-07-12 21:30:00.733154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.181 [2024-07-12 21:30:00.733265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a31 cdw11:00000000 00:07:22.181 [2024-07-12 21:30:00.733280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.181 #51 NEW cov: 11687 ft: 
14225 corp: 43/187b lim: 10 exec/s: 25 rss: 70Mb L: 4/10 MS: 1 ShuffleBytes- 00:07:22.181 #51 DONE cov: 11687 ft: 14225 corp: 43/187b lim: 10 exec/s: 25 rss: 70Mb 00:07:22.181 ###### Recommended dictionary. ###### 00:07:22.181 "\001\010" # Uses: 2 00:07:22.181 "\377\377\377\000" # Uses: 1 00:07:22.181 "\036\000" # Uses: 2 00:07:22.181 "x\012" # Uses: 0 00:07:22.181 ###### End of recommended dictionary. ###### 00:07:22.181 Done 51 runs in 2 second(s) 00:07:22.181 21:30:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:22.181 21:30:00 -- ../common.sh@72 -- # (( i++ )) 00:07:22.181 21:30:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:22.181 21:30:00 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:22.181 21:30:00 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:22.181 21:30:00 -- nvmf/run.sh@24 -- # local timen=1 00:07:22.181 21:30:00 -- nvmf/run.sh@25 -- # local core=0x1 00:07:22.181 21:30:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:22.181 21:30:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:22.181 21:30:00 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:22.181 21:30:00 -- nvmf/run.sh@29 -- # port=4407 00:07:22.181 21:30:00 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:22.181 21:30:00 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:22.181 21:30:00 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:22.181 21:30:00 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:22.181 [2024-07-12 21:30:00.926503] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:22.181 [2024-07-12 21:30:00.926572] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3580294 ] 00:07:22.181 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.440 [2024-07-12 21:30:01.119803] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.440 [2024-07-12 21:30:01.183740] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:22.440 [2024-07-12 21:30:01.183864] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.699 [2024-07-12 21:30:01.241864] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:22.699 [2024-07-12 21:30:01.258159] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:22.699 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:22.699 INFO: Seed: 2417383454 00:07:22.699 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:22.699 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:22.699 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:22.699 INFO: A corpus is not provided, starting from an empty corpus 00:07:22.699 #2 INITED exec/s: 0 rss: 61Mb 00:07:22.699 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:22.699 This may also happen if the target rejected all inputs we tried so far 00:07:22.699 [2024-07-12 21:30:01.302970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:22.699 [2024-07-12 21:30:01.303005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.699 [2024-07-12 21:30:01.303037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:22.699 [2024-07-12 21:30:01.303053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:22.699 [2024-07-12 21:30:01.303097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:22.699 [2024-07-12 21:30:01.303114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:22.699 [2024-07-12 21:30:01.303141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:22.699 [2024-07-12 21:30:01.303157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:22.699 [2024-07-12 21:30:01.303184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:22.699 [2024-07-12 21:30:01.303200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:22.958 NEW_FUNC[1/669]: 0x48c180 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:22.958 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:22.958 #3 NEW cov: 11460 ft: 11458 corp: 2/11b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:22.958 [2024-07-12 21:30:01.665162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b0b cdw11:00000000 00:07:22.958 [2024-07-12 21:30:01.665214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.958 #5 NEW cov: 11573 ft: 12400 corp: 3/13b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 2 ChangeBit-CopyPart- 00:07:22.958 [2024-07-12 21:30:01.704802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b0b cdw11:00000000 00:07:22.958 [2024-07-12 21:30:01.704829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.958 
#6 NEW cov: 11579 ft: 12719 corp: 4/15b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 CopyPart- 00:07:23.217 [2024-07-12 21:30:01.755820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000096c2 cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.755851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.217 [2024-07-12 21:30:01.755976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.755992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.217 [2024-07-12 21:30:01.756103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.756119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.217 [2024-07-12 21:30:01.756232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.756250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.217 #8 NEW cov: 11664 ft: 13079 corp: 5/24b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:23.217 [2024-07-12 21:30:01.795519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b70e cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.795548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.217 [2024-07-12 21:30:01.795670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000e24 cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.795687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.217 [2024-07-12 21:30:01.795804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000e47f cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.795820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.217 [2024-07-12 21:30:01.795936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.795953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.217 #9 NEW cov: 11664 ft: 13146 corp: 6/33b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 CMP- DE: "\267\016\016$\344\177\000\000"- 00:07:23.217 [2024-07-12 21:30:01.855493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000f0b cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.855521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.217 #10 NEW cov: 11664 ft: 13383 corp: 7/35b lim: 10 exec/s: 0 rss: 67Mb L: 2/10 MS: 1 ChangeBit- 00:07:23.217 [2024-07-12 21:30:01.906032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b70e cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.906061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.217 [2024-07-12 21:30:01.906175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000e0f cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.906192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.217 [2024-07-12 21:30:01.906308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000024e4 cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.906325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.217 [2024-07-12 21:30:01.906437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.906463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.217 [2024-07-12 21:30:01.906576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.906593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:23.217 #11 NEW cov: 11664 ft: 13459 corp: 8/45b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CrossOver- 00:07:23.217 [2024-07-12 21:30:01.955422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b0f cdw11:00000000 00:07:23.217 [2024-07-12 21:30:01.955455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.217 #12 NEW cov: 11664 ft: 13513 corp: 9/47b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:23.476 [2024-07-12 21:30:02.006579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b7e4 cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.006608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.006722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.006739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.006854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.006871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.006983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.007000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.476 #13 NEW cov: 11664 ft: 13552 corp: 10/56b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 
CrossOver- 00:07:23.476 [2024-07-12 21:30:02.056652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b0b cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.056683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.056796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.056813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.056934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.056951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.057062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.057079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.476 #14 NEW cov: 11664 ft: 13584 corp: 11/64b lim: 10 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 CrossOver- 00:07:23.476 [2024-07-12 21:30:02.106184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.106214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.476 #15 NEW cov: 11664 ft: 13642 corp: 12/66b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 CopyPart- 00:07:23.476 [2024-07-12 21:30:02.146333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.146361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.146482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.146499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.146613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.146630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.476 #16 NEW cov: 11664 ft: 13838 corp: 13/73b lim: 10 exec/s: 0 rss: 68Mb L: 7/10 MS: 1 EraseBytes- 00:07:23.476 [2024-07-12 21:30:02.206502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000f0b cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.206530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.476 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:23.476 #17 NEW cov: 11681 ft: 13889 corp: 14/75b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 
1 ShuffleBytes- 00:07:23.476 [2024-07-12 21:30:02.256973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b700 cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.257000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.257111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e47f cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.257128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.257237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.257254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.257365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000c200 cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.257382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.476 [2024-07-12 21:30:02.257494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:23.476 [2024-07-12 21:30:02.257511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:23.734 #18 NEW cov: 11681 ft: 13907 corp: 15/85b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:07:23.734 [2024-07-12 21:30:02.307574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.307598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.734 [2024-07-12 21:30:02.307729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000e0f cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.307745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.734 [2024-07-12 21:30:02.307861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000024e4 cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.307876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.734 [2024-07-12 21:30:02.307999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.308016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.734 [2024-07-12 21:30:02.308134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.308150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:23.734 #19 NEW cov: 11681 ft: 13922 corp: 16/95b lim: 10 exec/s: 19 rss: 69Mb L: 10/10 MS: 1 
CopyPart- 00:07:23.734 [2024-07-12 21:30:02.347055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b700 cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.347082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.734 [2024-07-12 21:30:02.347191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e47f cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.347206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.734 [2024-07-12 21:30:02.347317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000b70e cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.347334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.734 [2024-07-12 21:30:02.347445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000e00 cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.347462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.734 #20 NEW cov: 11681 ft: 13989 corp: 17/103b lim: 10 exec/s: 20 rss: 69Mb L: 8/10 MS: 1 CrossOver- 00:07:23.734 [2024-07-12 21:30:02.387815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b70e cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.387841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.734 [2024-07-12 21:30:02.387956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000e0f cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.387971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.734 [2024-07-12 21:30:02.388080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002400 cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.388095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.734 [2024-07-12 21:30:02.388185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.388202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.734 [2024-07-12 21:30:02.388320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.388337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:23.734 #21 NEW cov: 11681 ft: 14014 corp: 18/113b lim: 10 exec/s: 21 rss: 69Mb L: 10/10 MS: 1 CrossOver- 00:07:23.734 [2024-07-12 21:30:02.427620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b0b cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.427646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:23.734 [2024-07-12 21:30:02.427767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.734 [2024-07-12 21:30:02.427787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.735 [2024-07-12 21:30:02.427894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.735 [2024-07-12 21:30:02.427912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.735 [2024-07-12 21:30:02.428024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.735 [2024-07-12 21:30:02.428040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.735 #22 NEW cov: 11681 ft: 14030 corp: 19/121b lim: 10 exec/s: 22 rss: 69Mb L: 8/10 MS: 1 ChangeBit- 00:07:23.735 [2024-07-12 21:30:02.467194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b3d cdw11:00000000 00:07:23.735 [2024-07-12 21:30:02.467221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.735 #23 NEW cov: 11681 ft: 14120 corp: 20/123b lim: 10 exec/s: 23 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:07:23.735 [2024-07-12 21:30:02.508241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.735 [2024-07-12 21:30:02.508268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.735 [2024-07-12 21:30:02.508379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.735 [2024-07-12 21:30:02.508396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.735 [2024-07-12 21:30:02.508520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.735 [2024-07-12 21:30:02.508538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.735 [2024-07-12 21:30:02.508654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.735 [2024-07-12 21:30:02.508671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.735 [2024-07-12 21:30:02.508782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.735 [2024-07-12 21:30:02.508796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:23.993 #24 NEW cov: 11681 ft: 14135 corp: 21/133b lim: 10 exec/s: 24 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:23.993 [2024-07-12 21:30:02.547899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000096c2 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.547925] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.548034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.548053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.548174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.548192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.993 #25 NEW cov: 11681 ft: 14143 corp: 22/140b lim: 10 exec/s: 25 rss: 69Mb L: 7/10 MS: 1 EraseBytes- 00:07:23.993 [2024-07-12 21:30:02.587825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b0b cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.587856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.587975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffc0 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.587993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.588111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.588127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.588249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.588267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.993 #26 NEW cov: 11681 ft: 14156 corp: 23/149b lim: 10 exec/s: 26 rss: 69Mb L: 9/10 MS: 1 InsertByte- 00:07:23.993 [2024-07-12 21:30:02.628607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b700 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.628633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.628753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e47f cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.628771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.628885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.628900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.629022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000c200 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.629038] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.629155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.629172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:23.993 #27 NEW cov: 11681 ft: 14164 corp: 24/159b lim: 10 exec/s: 27 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:23.993 [2024-07-12 21:30:02.668157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.668184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.668298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b70e cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.668314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.668422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000e24 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.668438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.668549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000e47f cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.668566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.668670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.668687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:23.993 #28 NEW cov: 11681 ft: 14218 corp: 25/169b lim: 10 exec/s: 28 rss: 69Mb L: 10/10 MS: 1 PersAutoDict- DE: "\267\016\016$\344\177\000\000"- 00:07:23.993 [2024-07-12 21:30:02.708374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.708401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.708519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000247f cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.708536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.708649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000e0f cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.708666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.708782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000e400 cdw11:00000000 
00:07:23.993 [2024-07-12 21:30:02.708798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.708912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.708930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:23.993 #29 NEW cov: 11681 ft: 14313 corp: 26/179b lim: 10 exec/s: 29 rss: 69Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:23.993 [2024-07-12 21:30:02.748518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000096c2 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.748545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.748676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.748694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.993 [2024-07-12 21:30:02.748813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000039c2 cdw11:00000000 00:07:23.993 [2024-07-12 21:30:02.748829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.993 #30 NEW cov: 11681 ft: 14325 corp: 27/186b lim: 10 exec/s: 30 rss: 69Mb L: 7/10 MS: 1 ChangeBinInt- 00:07:24.252 [2024-07-12 21:30:02.788896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b700 cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.788923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.252 [2024-07-12 21:30:02.789054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f724 cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.789072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.252 [2024-07-12 21:30:02.789187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000e47f cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.789203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.252 [2024-07-12 21:30:02.789323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.789343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.252 #31 NEW cov: 11681 ft: 14338 corp: 28/195b lim: 10 exec/s: 31 rss: 69Mb L: 9/10 MS: 1 CMP- DE: "\000\367"- 00:07:24.252 [2024-07-12 21:30:02.828776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000096c2 cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.828802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.252 [2024-07-12 21:30:02.828911] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000c23f cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.828926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.252 [2024-07-12 21:30:02.829040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003dc2 cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.829055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.252 [2024-07-12 21:30:02.829176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.829192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.252 #32 NEW cov: 11681 ft: 14350 corp: 29/204b lim: 10 exec/s: 32 rss: 69Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:24.252 [2024-07-12 21:30:02.879111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b700 cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.879139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.252 [2024-07-12 21:30:02.879256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f724 cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.879273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.252 [2024-07-12 21:30:02.879378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000407f cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.879393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.252 [2024-07-12 21:30:02.879509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.879524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.252 #33 NEW cov: 11681 ft: 14358 corp: 30/213b lim: 10 exec/s: 33 rss: 69Mb L: 9/10 MS: 1 ChangeByte- 00:07:24.252 [2024-07-12 21:30:02.919480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b700 cdw11:00000000 00:07:24.252 [2024-07-12 21:30:02.919508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.252 [2024-07-12 21:30:02.919634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e47f cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.919652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.253 [2024-07-12 21:30:02.919773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.919788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.253 [2024-07-12 21:30:02.919910] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00003100 cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.919930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.253 [2024-07-12 21:30:02.920047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.920064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:24.253 #34 NEW cov: 11681 ft: 14378 corp: 31/223b lim: 10 exec/s: 34 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:24.253 [2024-07-12 21:30:02.959373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.959400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.253 [2024-07-12 21:30:02.959514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000180f cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.959530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.253 [2024-07-12 21:30:02.959643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000024e4 cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.959657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.253 [2024-07-12 21:30:02.959770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.959786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.253 [2024-07-12 21:30:02.959901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.959917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:24.253 #35 NEW cov: 11681 ft: 14387 corp: 32/233b lim: 10 exec/s: 35 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:24.253 [2024-07-12 21:30:02.999268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff27 cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.999294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.253 [2024-07-12 21:30:02.999423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.999438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.253 [2024-07-12 21:30:02.999555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:24.253 [2024-07-12 21:30:02.999570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.253 #36 NEW cov: 11681 ft: 
14392 corp: 33/240b lim: 10 exec/s: 36 rss: 70Mb L: 7/10 MS: 1 ChangeByte- 00:07:24.512 [2024-07-12 21:30:03.038536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b24 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.038560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.512 #37 NEW cov: 11681 ft: 14414 corp: 34/242b lim: 10 exec/s: 37 rss: 70Mb L: 2/10 MS: 1 ChangeByte- 00:07:24.512 [2024-07-12 21:30:03.079681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000096c2 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.079707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.079831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000c23f cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.079850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.079972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003dff cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.079990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.080112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.080129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.512 #38 NEW cov: 11681 ft: 14450 corp: 35/251b lim: 10 exec/s: 38 rss: 70Mb L: 9/10 MS: 1 CrossOver- 00:07:24.512 [2024-07-12 21:30:03.119208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000f3d cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.119235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.512 #39 NEW cov: 11681 ft: 14457 corp: 36/253b lim: 10 exec/s: 39 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:24.512 [2024-07-12 21:30:03.159874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.159901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.160008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000640f cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.160025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.160131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000024e4 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.160146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.160264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 
cdw10:00007f00 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.160280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.160390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.160404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:24.512 #40 NEW cov: 11681 ft: 14524 corp: 37/263b lim: 10 exec/s: 40 rss: 70Mb L: 10/10 MS: 1 ChangeByte- 00:07:24.512 [2024-07-12 21:30:03.199800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.199825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.199945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.199961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.200069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.200087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.200197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffc2 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.200212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.200318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000c2c2 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.200332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:24.512 #41 NEW cov: 11688 ft: 14540 corp: 38/273b lim: 10 exec/s: 41 rss: 70Mb L: 10/10 MS: 1 CrossOver- 00:07:24.512 [2024-07-12 21:30:03.240410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.240436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.240556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b70e cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.240573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.240694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000e24 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.240710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.240832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 
cdw10:0000e400 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.240849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.240971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.240988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:24.512 #42 NEW cov: 11688 ft: 14546 corp: 39/283b lim: 10 exec/s: 42 rss: 70Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:24.512 [2024-07-12 21:30:03.279498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002400 cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.279524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.512 [2024-07-12 21:30:03.279636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 00:07:24.512 [2024-07-12 21:30:03.279654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.771 #43 NEW cov: 11688 ft: 14682 corp: 40/288b lim: 10 exec/s: 21 rss: 70Mb L: 5/10 MS: 1 CrossOver- 00:07:24.771 #43 DONE cov: 11688 ft: 14682 corp: 40/288b lim: 10 exec/s: 21 rss: 70Mb 00:07:24.771 ###### Recommended dictionary. ###### 00:07:24.771 "\267\016\016$\344\177\000\000" # Uses: 1 00:07:24.771 "\000\367" # Uses: 0 00:07:24.771 ###### End of recommended dictionary. ###### 00:07:24.771 Done 43 runs in 2 second(s) 00:07:24.771 21:30:03 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:24.771 21:30:03 -- ../common.sh@72 -- # (( i++ )) 00:07:24.771 21:30:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:24.771 21:30:03 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:24.771 21:30:03 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:24.771 21:30:03 -- nvmf/run.sh@24 -- # local timen=1 00:07:24.771 21:30:03 -- nvmf/run.sh@25 -- # local core=0x1 00:07:24.771 21:30:03 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:24.771 21:30:03 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:24.771 21:30:03 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:24.771 21:30:03 -- nvmf/run.sh@29 -- # port=4408 00:07:24.771 21:30:03 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:24.771 21:30:03 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:24.771 21:30:03 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:24.771 21:30:03 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:24.771 [2024-07-12 21:30:03.465115] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:24.771 [2024-07-12 21:30:03.465183] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3580720 ] 00:07:24.771 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.030 [2024-07-12 21:30:03.645161] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.030 [2024-07-12 21:30:03.709797] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:25.030 [2024-07-12 21:30:03.709936] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.030 [2024-07-12 21:30:03.768197] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:25.030 [2024-07-12 21:30:03.784479] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:25.030 INFO: Running with entropic power schedule (0xFF, 100). 00:07:25.030 INFO: Seed: 648428745 00:07:25.289 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:25.289 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:25.289 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:25.289 INFO: A corpus is not provided, starting from an empty corpus 00:07:25.289 [2024-07-12 21:30:03.829707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.289 [2024-07-12 21:30:03.829736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.289 #2 INITED cov: 11488 ft: 11489 corp: 1/1b exec/s: 0 rss: 66Mb 00:07:25.289 [2024-07-12 21:30:03.859651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.289 [2024-07-12 21:30:03.859676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.289 #3 NEW cov: 11601 ft: 12055 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:07:25.289 [2024-07-12 21:30:03.899959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.289 [2024-07-12 21:30:03.899985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.289 [2024-07-12 21:30:03.900054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.289 [2024-07-12 21:30:03.900068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.289 #4 NEW cov: 11607 ft: 12922 corp: 3/4b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart- 00:07:25.289 [2024-07-12 21:30:03.939918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.289 [2024-07-12 21:30:03.939944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.289 #5 NEW cov: 11692 ft: 13110 corp: 4/5b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ChangeBit- 00:07:25.289 [2024-07-12 21:30:03.979989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.289 [2024-07-12 21:30:03.980018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.289 #6 NEW cov: 11692 ft: 13245 corp: 5/6b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:25.289 [2024-07-12 21:30:04.020121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.289 [2024-07-12 21:30:04.020146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.289 #7 NEW cov: 11692 ft: 13298 corp: 6/7b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 CrossOver- 00:07:25.289 [2024-07-12 21:30:04.060236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.289 [2024-07-12 21:30:04.060261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.548 #8 NEW cov: 11692 ft: 13408 corp: 7/8b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:25.548 [2024-07-12 21:30:04.100484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.548 [2024-07-12 21:30:04.100509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.548 [2024-07-12 21:30:04.100580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.548 [2024-07-12 21:30:04.100594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.548 #9 NEW cov: 11692 ft: 13428 corp: 8/10b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 InsertByte- 00:07:25.548 [2024-07-12 21:30:04.140633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.548 [2024-07-12 21:30:04.140658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.548 [2024-07-12 21:30:04.140710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.548 [2024-07-12 21:30:04.140724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.548 #10 NEW cov: 11692 ft: 13563 corp: 9/12b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CopyPart- 00:07:25.548 [2024-07-12 21:30:04.180738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.548 [2024-07-12 
21:30:04.180763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.548 [2024-07-12 21:30:04.180833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.548 [2024-07-12 21:30:04.180847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.548 #11 NEW cov: 11692 ft: 13615 corp: 10/14b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CopyPart- 00:07:25.548 [2024-07-12 21:30:04.220865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.548 [2024-07-12 21:30:04.220890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.548 [2024-07-12 21:30:04.220942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.548 [2024-07-12 21:30:04.220958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.548 #12 NEW cov: 11692 ft: 13647 corp: 11/16b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CopyPart- 00:07:25.548 [2024-07-12 21:30:04.260974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.548 [2024-07-12 21:30:04.260999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.548 [2024-07-12 21:30:04.261077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.548 [2024-07-12 21:30:04.261091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.548 #13 NEW cov: 11692 ft: 13659 corp: 12/18b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ShuffleBytes- 00:07:25.548 [2024-07-12 21:30:04.301256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.548 [2024-07-12 21:30:04.301281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.549 [2024-07-12 21:30:04.301354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.549 [2024-07-12 21:30:04.301368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.549 [2024-07-12 21:30:04.301425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.549 [2024-07-12 21:30:04.301438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.549 #14 NEW cov: 11692 ft: 13919 corp: 13/21b lim: 5 exec/s: 0 
rss: 67Mb L: 3/3 MS: 1 InsertByte- 00:07:25.807 [2024-07-12 21:30:04.341219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.807 [2024-07-12 21:30:04.341244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.807 [2024-07-12 21:30:04.341301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.807 [2024-07-12 21:30:04.341315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.807 #15 NEW cov: 11692 ft: 13938 corp: 14/23b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 CrossOver- 00:07:25.807 [2024-07-12 21:30:04.381324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.807 [2024-07-12 21:30:04.381349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.807 [2024-07-12 21:30:04.381404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.807 [2024-07-12 21:30:04.381418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.807 #16 NEW cov: 11692 ft: 13964 corp: 15/25b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 InsertByte- 00:07:25.808 [2024-07-12 21:30:04.421429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.808 [2024-07-12 21:30:04.421460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.808 [2024-07-12 21:30:04.421514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.808 [2024-07-12 21:30:04.421528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.808 #17 NEW cov: 11692 ft: 13969 corp: 16/27b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ChangeByte- 00:07:25.808 [2024-07-12 21:30:04.461728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.808 [2024-07-12 21:30:04.461754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.808 [2024-07-12 21:30:04.461827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.808 [2024-07-12 21:30:04.461842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.808 [2024-07-12 21:30:04.461895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.808 [2024-07-12 21:30:04.461909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:25.808 #18 NEW cov: 11692 ft: 13980 corp: 17/30b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 InsertByte- 00:07:25.808 [2024-07-12 21:30:04.501500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.808 [2024-07-12 21:30:04.501525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.808 #19 NEW cov: 11692 ft: 14094 corp: 18/31b lim: 5 exec/s: 0 rss: 68Mb L: 1/3 MS: 1 ChangeByte- 00:07:25.808 [2024-07-12 21:30:04.531585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.808 [2024-07-12 21:30:04.531610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.808 #20 NEW cov: 11692 ft: 14157 corp: 19/32b lim: 5 exec/s: 0 rss: 68Mb L: 1/3 MS: 1 ShuffleBytes- 00:07:25.808 [2024-07-12 21:30:04.561853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.808 [2024-07-12 21:30:04.561877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.808 [2024-07-12 21:30:04.561931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:25.808 [2024-07-12 21:30:04.561945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.808 #21 NEW cov: 11692 ft: 14224 corp: 20/34b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 CrossOver- 00:07:26.067 [2024-07-12 21:30:04.602212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.067 [2024-07-12 21:30:04.602237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.067 [2024-07-12 21:30:04.602309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.067 [2024-07-12 21:30:04.602323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.067 [2024-07-12 21:30:04.602382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.067 [2024-07-12 21:30:04.602394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.067 [2024-07-12 21:30:04.602457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.067 [2024-07-12 21:30:04.602471] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.067 #22 NEW cov: 11692 ft: 14517 corp: 21/38b lim: 5 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 CopyPart- 00:07:26.067 [2024-07-12 21:30:04.642043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.067 [2024-07-12 21:30:04.642067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.067 [2024-07-12 21:30:04.642138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.067 [2024-07-12 21:30:04.642152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.067 #23 NEW cov: 11692 ft: 14532 corp: 22/40b lim: 5 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 ChangeByte- 00:07:26.067 [2024-07-12 21:30:04.682021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.067 [2024-07-12 21:30:04.682045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.067 #24 NEW cov: 11692 ft: 14542 corp: 23/41b lim: 5 exec/s: 0 rss: 68Mb L: 1/4 MS: 1 ChangeBit- 00:07:26.067 [2024-07-12 21:30:04.722173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.067 [2024-07-12 21:30:04.722198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.326 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:26.326 #25 NEW cov: 11715 ft: 14593 corp: 24/42b lim: 5 exec/s: 25 rss: 69Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:26.326 [2024-07-12 21:30:05.022991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.326 [2024-07-12 21:30:05.023024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.326 #26 NEW cov: 11715 ft: 14606 corp: 25/43b lim: 5 exec/s: 26 rss: 69Mb L: 1/4 MS: 1 ChangeByte- 00:07:26.326 [2024-07-12 21:30:05.063230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.326 [2024-07-12 21:30:05.063257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.326 [2024-07-12 21:30:05.063317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.326 [2024-07-12 21:30:05.063346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.326 #27 NEW cov: 11715 ft: 14624 corp: 26/45b lim: 5 exec/s: 27 rss: 69Mb L: 2/4 MS: 1 ChangeBit- 00:07:26.326 
[2024-07-12 21:30:05.103324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.326 [2024-07-12 21:30:05.103352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.326 [2024-07-12 21:30:05.103408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.326 [2024-07-12 21:30:05.103423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.585 #28 NEW cov: 11715 ft: 14635 corp: 27/47b lim: 5 exec/s: 28 rss: 69Mb L: 2/4 MS: 1 ShuffleBytes- 00:07:26.585 [2024-07-12 21:30:05.143474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.585 [2024-07-12 21:30:05.143499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.585 [2024-07-12 21:30:05.143573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.585 [2024-07-12 21:30:05.143586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.585 #29 NEW cov: 11715 ft: 14652 corp: 28/49b lim: 5 exec/s: 29 rss: 70Mb L: 2/4 MS: 1 EraseBytes- 00:07:26.585 [2024-07-12 21:30:05.183602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.585 [2024-07-12 21:30:05.183627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.585 [2024-07-12 21:30:05.183698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.585 [2024-07-12 21:30:05.183712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.585 #30 NEW cov: 11715 ft: 14661 corp: 29/51b lim: 5 exec/s: 30 rss: 70Mb L: 2/4 MS: 1 InsertByte- 00:07:26.585 [2024-07-12 21:30:05.223583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.585 [2024-07-12 21:30:05.223609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.585 #31 NEW cov: 11715 ft: 14676 corp: 30/52b lim: 5 exec/s: 31 rss: 70Mb L: 1/4 MS: 1 CopyPart- 00:07:26.585 [2024-07-12 21:30:05.253809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.585 [2024-07-12 21:30:05.253835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.585 [2024-07-12 21:30:05.253894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.585 [2024-07-12 21:30:05.253907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.585 #32 NEW cov: 11715 ft: 14684 corp: 31/54b lim: 5 exec/s: 32 rss: 70Mb L: 2/4 MS: 1 InsertByte- 00:07:26.585 [2024-07-12 21:30:05.293943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.585 [2024-07-12 21:30:05.293968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.585 [2024-07-12 21:30:05.294026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.585 [2024-07-12 21:30:05.294040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.585 #33 NEW cov: 11715 ft: 14695 corp: 32/56b lim: 5 exec/s: 33 rss: 70Mb L: 2/4 MS: 1 ChangeBit- 00:07:26.585 [2024-07-12 21:30:05.334036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.585 [2024-07-12 21:30:05.334062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.585 [2024-07-12 21:30:05.334120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.585 [2024-07-12 21:30:05.334134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.585 #34 NEW cov: 11715 ft: 14735 corp: 33/58b lim: 5 exec/s: 34 rss: 70Mb L: 2/4 MS: 1 ChangeByte- 00:07:26.844 [2024-07-12 21:30:05.374019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.374045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.844 #35 NEW cov: 11715 ft: 14757 corp: 34/59b lim: 5 exec/s: 35 rss: 70Mb L: 1/4 MS: 1 ChangeBit- 00:07:26.844 [2024-07-12 21:30:05.414392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.414417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.844 [2024-07-12 21:30:05.414477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.414491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.844 [2024-07-12 21:30:05.414545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:26.844 [2024-07-12 21:30:05.414559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.844 #36 NEW cov: 11715 ft: 14779 corp: 35/62b lim: 5 exec/s: 36 rss: 70Mb L: 3/4 MS: 1 InsertByte- 00:07:26.844 [2024-07-12 21:30:05.454508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.454533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.844 [2024-07-12 21:30:05.454591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.454605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.844 [2024-07-12 21:30:05.454661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.454674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.844 [2024-07-12 21:30:05.494634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.494660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.844 [2024-07-12 21:30:05.494734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.494751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.844 [2024-07-12 21:30:05.494809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.494823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.844 #38 NEW cov: 11715 ft: 14785 corp: 36/65b lim: 5 exec/s: 38 rss: 70Mb L: 3/4 MS: 2 ChangeBit-CopyPart- 00:07:26.844 [2024-07-12 21:30:05.534459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.534484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.844 #39 NEW cov: 11715 ft: 14790 corp: 37/66b lim: 5 exec/s: 39 rss: 70Mb L: 1/4 MS: 1 CrossOver- 00:07:26.844 [2024-07-12 21:30:05.574565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.574591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.844 #40 NEW 
cov: 11715 ft: 14800 corp: 38/67b lim: 5 exec/s: 40 rss: 70Mb L: 1/4 MS: 1 EraseBytes- 00:07:26.844 [2024-07-12 21:30:05.614767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.614792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.844 [2024-07-12 21:30:05.614866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.844 [2024-07-12 21:30:05.614880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.103 #41 NEW cov: 11715 ft: 14805 corp: 39/69b lim: 5 exec/s: 41 rss: 70Mb L: 2/4 MS: 1 CrossOver- 00:07:27.103 [2024-07-12 21:30:05.654896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.103 [2024-07-12 21:30:05.654921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.103 [2024-07-12 21:30:05.654979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.103 [2024-07-12 21:30:05.654993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.103 #42 NEW cov: 11715 ft: 14829 corp: 40/71b lim: 5 exec/s: 42 rss: 70Mb L: 2/4 MS: 1 ChangeByte- 00:07:27.103 [2024-07-12 21:30:05.695197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.103 [2024-07-12 21:30:05.695222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.103 [2024-07-12 21:30:05.695281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.103 [2024-07-12 21:30:05.695295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.103 [2024-07-12 21:30:05.695351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.103 [2024-07-12 21:30:05.695367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.103 #43 NEW cov: 11715 ft: 14836 corp: 41/74b lim: 5 exec/s: 43 rss: 70Mb L: 3/4 MS: 1 CopyPart- 00:07:27.103 [2024-07-12 21:30:05.735166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.103 [2024-07-12 21:30:05.735190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.103 [2024-07-12 21:30:05.735261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT 
(15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.103 [2024-07-12 21:30:05.735274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.103 #44 NEW cov: 11715 ft: 14862 corp: 42/76b lim: 5 exec/s: 44 rss: 70Mb L: 2/4 MS: 1 ChangeByte- 00:07:27.104 [2024-07-12 21:30:05.775293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.104 [2024-07-12 21:30:05.775317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.104 [2024-07-12 21:30:05.775388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.104 [2024-07-12 21:30:05.775401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.104 #45 NEW cov: 11715 ft: 14873 corp: 43/78b lim: 5 exec/s: 45 rss: 70Mb L: 2/4 MS: 1 ChangeBinInt- 00:07:27.104 [2024-07-12 21:30:05.815413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.104 [2024-07-12 21:30:05.815437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.104 [2024-07-12 21:30:05.815514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.104 [2024-07-12 21:30:05.815528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.104 #46 NEW cov: 11715 ft: 14880 corp: 44/80b lim: 5 exec/s: 23 rss: 70Mb L: 2/4 MS: 1 ChangeByte- 00:07:27.104 #46 DONE cov: 11715 ft: 14880 corp: 44/80b lim: 5 exec/s: 23 rss: 70Mb 00:07:27.104 Done 46 runs in 2 second(s) 00:07:27.363 21:30:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:27.363 21:30:05 -- ../common.sh@72 -- # (( i++ )) 00:07:27.363 21:30:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:27.363 21:30:05 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:27.363 21:30:05 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:27.363 21:30:05 -- nvmf/run.sh@24 -- # local timen=1 00:07:27.363 21:30:05 -- nvmf/run.sh@25 -- # local core=0x1 00:07:27.363 21:30:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:27.363 21:30:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:27.363 21:30:05 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:27.363 21:30:05 -- nvmf/run.sh@29 -- # port=4409 00:07:27.363 21:30:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:27.363 21:30:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:27.363 21:30:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:27.363 21:30:05 -- nvmf/run.sh@36 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:27.363 [2024-07-12 21:30:06.006826] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:27.363 [2024-07-12 21:30:06.006893] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3581612 ] 00:07:27.363 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.622 [2024-07-12 21:30:06.182707] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.622 [2024-07-12 21:30:06.246360] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:27.622 [2024-07-12 21:30:06.246502] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.622 [2024-07-12 21:30:06.304647] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:27.622 [2024-07-12 21:30:06.320946] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:27.622 INFO: Running with entropic power schedule (0xFF, 100). 00:07:27.622 INFO: Seed: 3186428785 00:07:27.622 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:27.622 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:27.622 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:27.622 INFO: A corpus is not provided, starting from an empty corpus 00:07:27.622 [2024-07-12 21:30:06.366141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.622 [2024-07-12 21:30:06.366171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.622 #2 INITED cov: 11488 ft: 11489 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:27.622 [2024-07-12 21:30:06.396078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.622 [2024-07-12 21:30:06.396105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.881 #3 NEW cov: 11601 ft: 11928 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeBinInt- 00:07:27.881 [2024-07-12 21:30:06.436205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.881 [2024-07-12 21:30:06.436232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.881 #4 NEW cov: 11607 ft: 12172 corp: 3/3b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 CrossOver- 00:07:27.881 [2024-07-12 21:30:06.476318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:27.881 [2024-07-12 21:30:06.476343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.881 #5 NEW cov: 11692 ft: 12421 corp: 4/4b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ShuffleBytes- 00:07:27.881 [2024-07-12 21:30:06.516582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.881 [2024-07-12 21:30:06.516607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.881 [2024-07-12 21:30:06.516661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.881 [2024-07-12 21:30:06.516675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.881 #6 NEW cov: 11692 ft: 13264 corp: 5/6b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart- 00:07:27.881 [2024-07-12 21:30:06.556555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.881 [2024-07-12 21:30:06.556581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.881 #7 NEW cov: 11692 ft: 13363 corp: 6/7b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 CrossOver- 00:07:27.881 [2024-07-12 21:30:06.596830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.881 [2024-07-12 21:30:06.596854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.881 [2024-07-12 21:30:06.596907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.881 [2024-07-12 21:30:06.596921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.881 #8 NEW cov: 11692 ft: 13468 corp: 7/9b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 ChangeBit- 00:07:27.881 [2024-07-12 21:30:06.636778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:27.881 [2024-07-12 21:30:06.636802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.881 #9 NEW cov: 11692 ft: 13478 corp: 8/10b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:28.140 [2024-07-12 21:30:06.676929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.676955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.140 #10 NEW cov: 11692 ft: 13517 corp: 9/11b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 CrossOver- 00:07:28.140 [2024-07-12 21:30:06.717026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.717051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.140 #11 NEW cov: 11692 ft: 13567 corp: 10/12b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeBit- 00:07:28.140 [2024-07-12 21:30:06.747266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.747291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.140 [2024-07-12 21:30:06.747361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.747374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.140 #12 NEW cov: 11692 ft: 13614 corp: 11/14b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 InsertByte- 00:07:28.140 [2024-07-12 21:30:06.787395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.787420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.140 [2024-07-12 21:30:06.787492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.787506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.140 #13 NEW cov: 11692 ft: 13670 corp: 12/16b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CrossOver- 00:07:28.140 [2024-07-12 21:30:06.827503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.827528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.140 [2024-07-12 21:30:06.827595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.827608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.140 #14 NEW cov: 11692 ft: 13705 corp: 13/18b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CrossOver- 00:07:28.140 [2024-07-12 21:30:06.867736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.867761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.140 [2024-07-12 21:30:06.867817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.867830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.140 [2024-07-12 21:30:06.867883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.867897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.140 #15 NEW cov: 11692 ft: 13939 corp: 14/21b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 CopyPart- 00:07:28.140 [2024-07-12 21:30:06.907764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.907788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.140 [2024-07-12 21:30:06.907855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.140 [2024-07-12 21:30:06.907869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.399 #16 NEW cov: 11692 ft: 14026 corp: 15/23b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 InsertByte- 00:07:28.399 [2024-07-12 21:30:06.947879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.399 [2024-07-12 21:30:06.947903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.399 [2024-07-12 21:30:06.947959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.399 [2024-07-12 21:30:06.947973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.399 #17 NEW cov: 11692 ft: 14045 corp: 16/25b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 ChangeByte- 00:07:28.399 [2024-07-12 21:30:06.987982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.399 [2024-07-12 21:30:06.988007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.399 [2024-07-12 21:30:06.988060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.399 [2024-07-12 21:30:06.988076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.399 #18 NEW cov: 11692 ft: 14056 corp: 17/27b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 InsertByte- 00:07:28.399 [2024-07-12 21:30:07.027932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.399 [2024-07-12 21:30:07.027956] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.399 #19 NEW cov: 11692 ft: 14087 corp: 18/28b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 EraseBytes- 00:07:28.399 [2024-07-12 21:30:07.068093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.399 [2024-07-12 21:30:07.068117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.399 #20 NEW cov: 11692 ft: 14142 corp: 19/29b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ChangeBit- 00:07:28.399 [2024-07-12 21:30:07.108179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.399 [2024-07-12 21:30:07.108203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.399 #21 NEW cov: 11692 ft: 14161 corp: 20/30b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ChangeByte- 00:07:28.399 [2024-07-12 21:30:07.138456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.399 [2024-07-12 21:30:07.138480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.399 [2024-07-12 21:30:07.138534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.399 [2024-07-12 21:30:07.138548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.399 #22 NEW cov: 11692 ft: 14163 corp: 21/32b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 ShuffleBytes- 00:07:28.399 [2024-07-12 21:30:07.178380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.399 [2024-07-12 21:30:07.178405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.659 #23 NEW cov: 11692 ft: 14185 corp: 22/33b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 CopyPart- 00:07:28.659 [2024-07-12 21:30:07.218688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.659 [2024-07-12 21:30:07.218712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.659 [2024-07-12 21:30:07.218766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.659 [2024-07-12 21:30:07.218778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.659 #24 NEW cov: 11692 ft: 14207 corp: 23/35b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 CopyPart- 00:07:28.659 [2024-07-12 21:30:07.258833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.659 [2024-07-12 21:30:07.258858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.659 [2024-07-12 21:30:07.258913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.659 [2024-07-12 21:30:07.258926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.918 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:28.918 #25 NEW cov: 11715 ft: 14247 corp: 24/37b lim: 5 exec/s: 25 rss: 69Mb L: 2/3 MS: 1 InsertByte- 00:07:28.918 [2024-07-12 21:30:07.569821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.918 [2024-07-12 21:30:07.569854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.918 [2024-07-12 21:30:07.569916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.918 [2024-07-12 21:30:07.569930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.918 #26 NEW cov: 11715 ft: 14278 corp: 25/39b lim: 5 exec/s: 26 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:07:28.918 [2024-07-12 21:30:07.609798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.918 [2024-07-12 21:30:07.609824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.918 [2024-07-12 21:30:07.609902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.918 [2024-07-12 21:30:07.609916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.919 #27 NEW cov: 11715 ft: 14304 corp: 26/41b lim: 5 exec/s: 27 rss: 69Mb L: 2/3 MS: 1 ChangeByte- 00:07:28.919 [2024-07-12 21:30:07.649708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.919 [2024-07-12 21:30:07.649733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.919 #28 NEW cov: 11715 ft: 14359 corp: 27/42b lim: 5 exec/s: 28 rss: 69Mb L: 1/3 MS: 1 ChangeBit- 00:07:28.919 [2024-07-12 21:30:07.690234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.919 [2024-07-12 21:30:07.690259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.919 [2024-07-12 21:30:07.690321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 
cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.919 [2024-07-12 21:30:07.690335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.919 [2024-07-12 21:30:07.690395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:28.919 [2024-07-12 21:30:07.690409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.178 #29 NEW cov: 11715 ft: 14426 corp: 28/45b lim: 5 exec/s: 29 rss: 69Mb L: 3/3 MS: 1 ShuffleBytes- 00:07:29.178 [2024-07-12 21:30:07.729962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.729987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.178 #30 NEW cov: 11715 ft: 14478 corp: 29/46b lim: 5 exec/s: 30 rss: 69Mb L: 1/3 MS: 1 ChangeByte- 00:07:29.178 [2024-07-12 21:30:07.770120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.770145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.178 #31 NEW cov: 11715 ft: 14493 corp: 30/47b lim: 5 exec/s: 31 rss: 69Mb L: 1/3 MS: 1 ChangeBit- 00:07:29.178 [2024-07-12 21:30:07.810249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.810275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.178 #32 NEW cov: 11715 ft: 14525 corp: 31/48b lim: 5 exec/s: 32 rss: 70Mb L: 1/3 MS: 1 CrossOver- 00:07:29.178 [2024-07-12 21:30:07.851019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.851044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.178 [2024-07-12 21:30:07.851119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.851133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.178 [2024-07-12 21:30:07.851192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.851205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.178 [2024-07-12 21:30:07.851264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 
[2024-07-12 21:30:07.851277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.178 [2024-07-12 21:30:07.851335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.851349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:29.178 #33 NEW cov: 11715 ft: 14845 corp: 32/53b lim: 5 exec/s: 33 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:29.178 [2024-07-12 21:30:07.890953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.890978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.178 [2024-07-12 21:30:07.891037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.891050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.178 [2024-07-12 21:30:07.891108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.891121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.178 [2024-07-12 21:30:07.891177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.891193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.178 #34 NEW cov: 11715 ft: 14852 corp: 33/57b lim: 5 exec/s: 34 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:29.178 [2024-07-12 21:30:07.930794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.930820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.178 [2024-07-12 21:30:07.930896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.178 [2024-07-12 21:30:07.930910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.178 #35 NEW cov: 11715 ft: 14856 corp: 34/59b lim: 5 exec/s: 35 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:29.437 [2024-07-12 21:30:07.970687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.437 [2024-07-12 21:30:07.970713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.437 #36 NEW cov: 11715 ft: 
14865 corp: 35/60b lim: 5 exec/s: 36 rss: 70Mb L: 1/5 MS: 1 CrossOver- 00:07:29.437 [2024-07-12 21:30:08.000905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.437 [2024-07-12 21:30:08.000931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.437 [2024-07-12 21:30:08.000989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.437 [2024-07-12 21:30:08.001003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.437 #37 NEW cov: 11715 ft: 14866 corp: 36/62b lim: 5 exec/s: 37 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:29.437 [2024-07-12 21:30:08.041078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.437 [2024-07-12 21:30:08.041104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.437 [2024-07-12 21:30:08.041161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.437 [2024-07-12 21:30:08.041175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.438 #38 NEW cov: 11715 ft: 14891 corp: 37/64b lim: 5 exec/s: 38 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:29.438 [2024-07-12 21:30:08.081187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.438 [2024-07-12 21:30:08.081212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.438 [2024-07-12 21:30:08.081271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.438 [2024-07-12 21:30:08.081285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.438 #39 NEW cov: 11715 ft: 14904 corp: 38/66b lim: 5 exec/s: 39 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:07:29.438 [2024-07-12 21:30:08.121280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.438 [2024-07-12 21:30:08.121308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.438 [2024-07-12 21:30:08.121370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.438 [2024-07-12 21:30:08.121383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.438 #40 NEW cov: 11715 ft: 14968 corp: 39/68b lim: 5 exec/s: 40 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:07:29.438 [2024-07-12 
21:30:08.161716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.438 [2024-07-12 21:30:08.161741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.438 [2024-07-12 21:30:08.161803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.438 [2024-07-12 21:30:08.161816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.438 [2024-07-12 21:30:08.161872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.438 [2024-07-12 21:30:08.161885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.438 [2024-07-12 21:30:08.161941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.438 [2024-07-12 21:30:08.161953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.438 #41 NEW cov: 11715 ft: 15003 corp: 40/72b lim: 5 exec/s: 41 rss: 70Mb L: 4/5 MS: 1 ChangeBinInt- 00:07:29.438 [2024-07-12 21:30:08.201527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.438 [2024-07-12 21:30:08.201552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.438 [2024-07-12 21:30:08.201612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.438 [2024-07-12 21:30:08.201626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.697 #42 NEW cov: 11715 ft: 15012 corp: 41/74b lim: 5 exec/s: 42 rss: 70Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:29.697 [2024-07-12 21:30:08.241811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.697 [2024-07-12 21:30:08.241836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.697 [2024-07-12 21:30:08.241912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.697 [2024-07-12 21:30:08.241927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.697 [2024-07-12 21:30:08.241989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.697 [2024-07-12 21:30:08.242002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.697 #43 NEW cov: 11715 ft: 15017 corp: 42/77b lim: 5 exec/s: 43 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:29.697 [2024-07-12 21:30:08.281996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.697 [2024-07-12 21:30:08.282022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.697 [2024-07-12 21:30:08.282085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.697 [2024-07-12 21:30:08.282099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.697 [2024-07-12 21:30:08.282158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.697 [2024-07-12 21:30:08.282171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.697 #44 NEW cov: 11715 ft: 15029 corp: 43/80b lim: 5 exec/s: 44 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:07:29.698 [2024-07-12 21:30:08.321859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.698 [2024-07-12 21:30:08.321884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.698 [2024-07-12 21:30:08.321943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.698 [2024-07-12 21:30:08.321955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.698 #45 NEW cov: 11715 ft: 15034 corp: 44/82b lim: 5 exec/s: 45 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:07:29.698 [2024-07-12 21:30:08.361973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.698 [2024-07-12 21:30:08.361998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.698 [2024-07-12 21:30:08.362057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:29.698 [2024-07-12 21:30:08.362070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.698 #46 NEW cov: 11715 ft: 15049 corp: 45/84b lim: 5 exec/s: 23 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:07:29.698 #46 DONE cov: 11715 ft: 15049 corp: 45/84b lim: 5 exec/s: 23 rss: 70Mb 00:07:29.698 Done 46 runs in 2 second(s) 00:07:29.957 21:30:08 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:29.957 21:30:08 -- ../common.sh@72 -- # (( i++ )) 00:07:29.957 21:30:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:29.957 21:30:08 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:29.957 21:30:08 -- nvmf/run.sh@23 -- # 
local fuzzer_type=10 00:07:29.957 21:30:08 -- nvmf/run.sh@24 -- # local timen=1 00:07:29.957 21:30:08 -- nvmf/run.sh@25 -- # local core=0x1 00:07:29.957 21:30:08 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:29.957 21:30:08 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:29.957 21:30:08 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:29.957 21:30:08 -- nvmf/run.sh@29 -- # port=4410 00:07:29.957 21:30:08 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:29.957 21:30:08 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:29.957 21:30:08 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:29.957 21:30:08 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:29.957 [2024-07-12 21:30:08.546669] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:29.957 [2024-07-12 21:30:08.546738] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3582160 ] 00:07:29.957 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.957 [2024-07-12 21:30:08.721634] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.217 [2024-07-12 21:30:08.785136] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:30.217 [2024-07-12 21:30:08.785256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.217 [2024-07-12 21:30:08.843063] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:30.217 [2024-07-12 21:30:08.859351] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:30.217 INFO: Running with entropic power schedule (0xFF, 100). 00:07:30.217 INFO: Seed: 1428441538 00:07:30.217 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:30.217 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:30.217 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:30.217 INFO: A corpus is not provided, starting from an empty corpus 00:07:30.217 #2 INITED exec/s: 0 rss: 60Mb 00:07:30.217 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:30.217 This may also happen if the target rejected all inputs we tried so far 00:07:30.217 [2024-07-12 21:30:08.904645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.217 [2024-07-12 21:30:08.904673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.217 [2024-07-12 21:30:08.904743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.217 [2024-07-12 21:30:08.904758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.476 NEW_FUNC[1/670]: 0x48daf0 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:30.476 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:30.476 #8 NEW cov: 11511 ft: 11512 corp: 2/22b lim: 40 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:07:30.476 [2024-07-12 21:30:09.215476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.476 [2024-07-12 21:30:09.215509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.476 [2024-07-12 21:30:09.215574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.476 [2024-07-12 21:30:09.215589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.476 #9 NEW cov: 11624 ft: 12065 corp: 3/43b lim: 40 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 CrossOver- 00:07:30.476 [2024-07-12 21:30:09.255510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.476 [2024-07-12 21:30:09.255538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.476 [2024-07-12 21:30:09.255602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.476 [2024-07-12 21:30:09.255616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.736 #10 NEW cov: 11630 ft: 12307 corp: 4/64b lim: 40 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 ChangeBinInt- 00:07:30.736 [2024-07-12 21:30:09.295760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.295785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.736 [2024-07-12 21:30:09.295850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0000ffff 
cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.295864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.736 [2024-07-12 21:30:09.295927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:000a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.295940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.736 #11 NEW cov: 11715 ft: 12780 corp: 5/89b lim: 40 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:30.736 [2024-07-12 21:30:09.335795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.335820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.736 [2024-07-12 21:30:09.335881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.335894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.736 #14 NEW cov: 11715 ft: 12912 corp: 6/111b lim: 40 exec/s: 0 rss: 67Mb L: 22/25 MS: 3 ChangeBit-ChangeBit-CrossOver- 00:07:30.736 [2024-07-12 21:30:09.376016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.376041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.736 [2024-07-12 21:30:09.376106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.376120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.736 [2024-07-12 21:30:09.376181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00020000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.376194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.736 #15 NEW cov: 11715 ft: 12968 corp: 7/136b lim: 40 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 ChangeBit- 00:07:30.736 [2024-07-12 21:30:09.416152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.416177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.736 [2024-07-12 21:30:09.416259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00005dff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.416276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.736 [2024-07-12 21:30:09.416340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.416354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.736 #16 NEW cov: 11715 ft: 13170 corp: 8/162b lim: 40 exec/s: 0 rss: 68Mb L: 26/26 MS: 1 InsertByte- 00:07:30.736 [2024-07-12 21:30:09.456294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.456320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.736 [2024-07-12 21:30:09.456381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.456395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.736 [2024-07-12 21:30:09.456457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.456471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.736 #17 NEW cov: 11715 ft: 13220 corp: 9/187b lim: 40 exec/s: 0 rss: 68Mb L: 25/26 MS: 1 CrossOver- 00:07:30.736 [2024-07-12 21:30:09.496238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.496265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.736 [2024-07-12 21:30:09.496326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.736 [2024-07-12 21:30:09.496340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.736 #18 NEW cov: 11715 ft: 13374 corp: 10/206b lim: 40 exec/s: 0 rss: 68Mb L: 19/26 MS: 1 EraseBytes- 00:07:30.996 [2024-07-12 21:30:09.536341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.536367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.996 [2024-07-12 21:30:09.536425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.536439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.996 #19 NEW cov: 11715 ft: 13398 corp: 11/227b lim: 40 exec/s: 0 rss: 68Mb L: 21/26 MS: 1 ChangeBinInt- 00:07:30.996 [2024-07-12 21:30:09.576459] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.576485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.996 [2024-07-12 21:30:09.576545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:001d0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.576559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.996 #20 NEW cov: 11715 ft: 13492 corp: 12/249b lim: 40 exec/s: 0 rss: 68Mb L: 22/26 MS: 1 InsertByte- 00:07:30.996 [2024-07-12 21:30:09.616864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.616889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.996 [2024-07-12 21:30:09.616951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.616965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.996 [2024-07-12 21:30:09.617024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.617038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.996 [2024-07-12 21:30:09.617075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.617088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.996 #21 NEW cov: 11715 ft: 13961 corp: 13/282b lim: 40 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 CopyPart- 00:07:30.996 [2024-07-12 21:30:09.656861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.656886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.996 [2024-07-12 21:30:09.656949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00003fff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.656963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.996 [2024-07-12 21:30:09.657025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:000a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.657038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:30.996 #22 NEW cov: 11715 ft: 13977 corp: 14/307b lim: 40 exec/s: 0 rss: 68Mb L: 25/33 MS: 1 ChangeByte- 00:07:30.996 [2024-07-12 21:30:09.696695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.696720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.996 #23 NEW cov: 11715 ft: 14301 corp: 15/321b lim: 40 exec/s: 0 rss: 68Mb L: 14/33 MS: 1 EraseBytes- 00:07:30.996 [2024-07-12 21:30:09.736826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.736851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.996 #24 NEW cov: 11715 ft: 14382 corp: 16/332b lim: 40 exec/s: 0 rss: 68Mb L: 11/33 MS: 1 EraseBytes- 00:07:30.996 [2024-07-12 21:30:09.777415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.777446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.996 [2024-07-12 21:30:09.777509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.777526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.996 [2024-07-12 21:30:09.777587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.777601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.996 [2024-07-12 21:30:09.777660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00060000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.996 [2024-07-12 21:30:09.777674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.256 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:31.256 #25 NEW cov: 11738 ft: 14422 corp: 17/365b lim: 40 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ChangeBinInt- 00:07:31.256 [2024-07-12 21:30:09.827497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:09.827523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.256 [2024-07-12 21:30:09.827584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:09.827597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.256 [2024-07-12 21:30:09.827656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:09.827669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.256 [2024-07-12 21:30:09.827726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00060000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:09.827740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.256 #26 NEW cov: 11738 ft: 14461 corp: 18/398b lim: 40 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 ShuffleBytes- 00:07:31.256 [2024-07-12 21:30:09.867244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:09.867269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.256 #27 NEW cov: 11738 ft: 14468 corp: 19/412b lim: 40 exec/s: 27 rss: 69Mb L: 14/33 MS: 1 ShuffleBytes- 00:07:31.256 [2024-07-12 21:30:09.907481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:09.907506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.256 [2024-07-12 21:30:09.907566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:09.907580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.256 #28 NEW cov: 11738 ft: 14502 corp: 20/434b lim: 40 exec/s: 28 rss: 69Mb L: 22/33 MS: 1 CopyPart- 00:07:31.256 [2024-07-12 21:30:09.947567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:000a0300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:09.947595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.256 [2024-07-12 21:30:09.947659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:001d0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:09.947672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.256 #29 NEW cov: 11738 ft: 14514 corp: 21/456b lim: 40 exec/s: 29 rss: 69Mb L: 22/33 MS: 1 ShuffleBytes- 00:07:31.256 [2024-07-12 21:30:09.987703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:09.987729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.256 [2024-07-12 21:30:09.987792] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:001d0000 cdw11:000a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:09.987806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.256 #30 NEW cov: 11738 ft: 14521 corp: 22/478b lim: 40 exec/s: 30 rss: 69Mb L: 22/33 MS: 1 ShuffleBytes- 00:07:31.256 [2024-07-12 21:30:10.027834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a200300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:10.027860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.256 [2024-07-12 21:30:10.027939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:001d0000 cdw11:000a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.256 [2024-07-12 21:30:10.027954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.516 #31 NEW cov: 11738 ft: 14534 corp: 23/500b lim: 40 exec/s: 31 rss: 69Mb L: 22/33 MS: 1 ChangeBit- 00:07:31.516 [2024-07-12 21:30:10.067960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.068003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.516 [2024-07-12 21:30:10.068069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:21000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.068083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.516 #32 NEW cov: 11738 ft: 14559 corp: 24/522b lim: 40 exec/s: 32 rss: 69Mb L: 22/33 MS: 1 InsertByte- 00:07:31.516 [2024-07-12 21:30:10.098158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.098186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.516 [2024-07-12 21:30:10.098267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00ff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.098282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.516 [2024-07-12 21:30:10.098348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00020000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.098362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.516 #33 NEW cov: 11738 ft: 14598 corp: 25/547b lim: 40 exec/s: 33 rss: 69Mb L: 25/33 MS: 1 ShuffleBytes- 00:07:31.516 [2024-07-12 21:30:10.138203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 
cdw10:0a000300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.138228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.516 [2024-07-12 21:30:10.138303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:001d0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.138318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.516 #34 NEW cov: 11738 ft: 14680 corp: 26/569b lim: 40 exec/s: 34 rss: 69Mb L: 22/33 MS: 1 CrossOver- 00:07:31.516 [2024-07-12 21:30:10.178419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.178448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.516 [2024-07-12 21:30:10.178525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:21000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.178539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.516 [2024-07-12 21:30:10.178611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.178625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.516 #35 NEW cov: 11738 ft: 14692 corp: 27/595b lim: 40 exec/s: 35 rss: 69Mb L: 26/33 MS: 1 InsertRepeatedBytes- 00:07:31.516 [2024-07-12 21:30:10.218369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.218394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.516 [2024-07-12 21:30:10.218457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.218471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.516 #36 NEW cov: 11738 ft: 14707 corp: 28/617b lim: 40 exec/s: 36 rss: 69Mb L: 22/33 MS: 1 ChangeByte- 00:07:31.516 [2024-07-12 21:30:10.258658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.258683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.516 [2024-07-12 21:30:10.258745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:000a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.258759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.516 [2024-07-12 21:30:10.258818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.516 [2024-07-12 21:30:10.258832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.516 #37 NEW cov: 11738 ft: 14744 corp: 29/643b lim: 40 exec/s: 37 rss: 69Mb L: 26/33 MS: 1 CopyPart- 00:07:31.776 [2024-07-12 21:30:10.298783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:29000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.298813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.298877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.298891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.298949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.298963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.776 #38 NEW cov: 11738 ft: 14848 corp: 30/668b lim: 40 exec/s: 38 rss: 69Mb L: 25/33 MS: 1 ChangeByte- 00:07:31.776 [2024-07-12 21:30:10.338746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:29000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.338770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.338832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.338845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.776 #39 NEW cov: 11738 ft: 14908 corp: 31/687b lim: 40 exec/s: 39 rss: 69Mb L: 19/33 MS: 1 EraseBytes- 00:07:31.776 [2024-07-12 21:30:10.379070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.379094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.379153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.379167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.379227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.379241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.379299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.379312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.776 #40 NEW cov: 11738 ft: 14914 corp: 32/726b lim: 40 exec/s: 40 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:31.776 [2024-07-12 21:30:10.419222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.419246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.419304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:21000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.419318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.419381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00002100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.419394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.419455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.419468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.776 #41 NEW cov: 11738 ft: 14941 corp: 33/765b lim: 40 exec/s: 41 rss: 69Mb L: 39/39 MS: 1 CopyPart- 00:07:31.776 [2024-07-12 21:30:10.459491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.459517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.459579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00003fff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.459592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.459654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00210000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.459667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.459728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY 
RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.459742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.459802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.459816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:31.776 #42 NEW cov: 11738 ft: 14991 corp: 34/805b lim: 40 exec/s: 42 rss: 70Mb L: 40/40 MS: 1 CrossOver- 00:07:31.776 [2024-07-12 21:30:10.499024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.499048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.776 [2024-07-12 21:30:10.539159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.776 [2024-07-12 21:30:10.539184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.035 #44 NEW cov: 11738 ft: 14995 corp: 35/816b lim: 40 exec/s: 44 rss: 70Mb L: 11/40 MS: 2 ChangeBit-CopyPart- 00:07:32.036 [2024-07-12 21:30:10.579552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.579577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.579642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.579655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.579736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.579750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.036 #45 NEW cov: 11738 ft: 15000 corp: 36/843b lim: 40 exec/s: 45 rss: 70Mb L: 27/40 MS: 1 CrossOver- 00:07:32.036 [2024-07-12 21:30:10.619642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.619666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.619743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00003fff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.619757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.619820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:000a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.619834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.036 #46 NEW cov: 11738 ft: 15007 corp: 37/868b lim: 40 exec/s: 46 rss: 70Mb L: 25/40 MS: 1 ChangeBit- 00:07:32.036 [2024-07-12 21:30:10.659803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.659827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.659890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:21000000 cdw11:001a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.659903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.659961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.659975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.036 #47 NEW cov: 11738 ft: 15018 corp: 38/894b lim: 40 exec/s: 47 rss: 70Mb L: 26/40 MS: 1 ChangeBinInt- 00:07:32.036 [2024-07-12 21:30:10.699625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.699649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.036 #48 NEW cov: 11738 ft: 15029 corp: 39/905b lim: 40 exec/s: 48 rss: 70Mb L: 11/40 MS: 1 ShuffleBytes- 00:07:32.036 [2024-07-12 21:30:10.730120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000b0b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.730145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.730206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0b0b0b0b cdw11:0b0b0b0b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.730220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.730282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0b0b0b0b cdw11:0b0b0b0b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.730297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.730357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 
nsid:0 cdw10:0b000000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.730370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.036 #49 NEW cov: 11738 ft: 15040 corp: 40/938b lim: 40 exec/s: 49 rss: 70Mb L: 33/40 MS: 1 InsertRepeatedBytes- 00:07:32.036 [2024-07-12 21:30:10.770245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.770268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.770330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.770344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.770405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.770419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.770482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.770496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.036 #50 NEW cov: 11738 ft: 15046 corp: 41/977b lim: 40 exec/s: 50 rss: 70Mb L: 39/40 MS: 1 ShuffleBytes- 00:07:32.036 [2024-07-12 21:30:10.810370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a00032c cdw11:2c2c2c2c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.810394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.810454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c2c2c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.810468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.810526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:2c2c0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.810540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.036 [2024-07-12 21:30:10.810597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:1d000000 cdw11:000a0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.036 [2024-07-12 21:30:10.810610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.296 #51 NEW cov: 11738 ft: 15061 corp: 42/1014b lim: 40 
exec/s: 51 rss: 70Mb L: 37/40 MS: 1 InsertRepeatedBytes-
00:07:32.296 [2024-07-12 21:30:10.850497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:32.296 [2024-07-12 21:30:10.850521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.296 [2024-07-12 21:30:10.850568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:32.296 [2024-07-12 21:30:10.850584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.296 [2024-07-12 21:30:10.850660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:32.296 [2024-07-12 21:30:10.850674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:32.296 [2024-07-12 21:30:10.850734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:0000008f cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:32.296 [2024-07-12 21:30:10.850748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:32.296 #52 NEW cov: 11738 ft: 15081 corp: 43/1053b lim: 40 exec/s: 52 rss: 70Mb L: 39/40 MS: 1 ChangeByte-
00:07:32.296 [2024-07-12 21:30:10.890610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:32.296 [2024-07-12 21:30:10.890635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.296 [2024-07-12 21:30:10.890694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:32.296 [2024-07-12 21:30:10.890707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.296 [2024-07-12 21:30:10.890754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:32.296 [2024-07-12 21:30:10.890767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:32.296 [2024-07-12 21:30:10.890829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00060000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:32.296 [2024-07-12 21:30:10.890842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:32.296 #53 NEW cov: 11738 ft: 15086 corp: 44/1086b lim: 40 exec/s: 26 rss: 70Mb L: 33/40 MS: 1 ShuffleBytes-
00:07:32.296 #53 DONE cov: 11738 ft: 15086 corp: 44/1086b lim: 40 exec/s: 26 rss: 70Mb
00:07:32.296 Done 53 runs in 2 second(s)
00:07:32.296 21:30:11 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf
00:07:32.296 21:30:11 -- ../common.sh@72 -- # (( i++ ))
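At this point fuzz target 10 is finished: its per-run config /tmp/fuzz_json_10.conf is removed and the loop index advances, and the next iteration below launches target 11. Condensed into plain shell, the ../common.sh loop and the start_llvm_fuzz steps traced here amount to roughly the sketch below. It is reconstructed from the xtrace output only, not from the actual nvmf/run.sh or common.sh sources; rootdir, output_dir and fuzz_num stand in for values the harness defines elsewhere, and redirecting the sed output into the config file is an assumption (the trace shows only the sed command itself).

# Sketch reconstructed from the xtrace lines in this log -- not the verbatim scripts.
rootdir=${rootdir:-/path/to/spdk}            # stand-in for the Jenkins workspace path
output_dir=${output_dir:-$rootdir/../output} # stand-in; trace shows spdk/../output
fuzz_num=${fuzz_num:-12}                     # stand-in; the harness sets the real target count

start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3    # trace: start_llvm_fuzz 11 1 0x1
    local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
    local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    local port
    port=44$(printf %02d "$fuzzer_type")     # printf %02d 11 -> port=4411 in the trace
    mkdir -p "$corpus_dir"
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # Rewrite the listener port in the target's JSON config for this instance
    # (redirection into $nvmf_cfg is assumed; the trace shows only the sed command).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    # One libFuzzer-driven run per target: -Z selects the fuzzer, -t its time budget,
    # -D the persistent corpus directory, -F the NVMe-oF transport ID to attack.
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$output_dir/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"
    rm -rf "$nvmf_cfg"                       # nvmf/run.sh@46 in the trace
}

for ((i = 0; i < fuzz_num; i++)); do         # ../common.sh@72
    start_llvm_fuzz "$i" 1 0x1               # ../common.sh@73
done

Judging by the log, each -Z value selects a different admin-command fuzzer inside llvm_nvme_fuzz: the NOTICE lines above came from target 10 issuing SECURITY RECEIVE (opcode 0x82) commands, while the run that starts below drives SECURITY SEND (0x81) through fuzz_admin_security_send_command (see the NEW_FUNC lines further down).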
00:07:32.296 21:30:11 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:32.296 21:30:11 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1
00:07:32.296 21:30:11 -- nvmf/run.sh@23 -- # local fuzzer_type=11
00:07:32.296 21:30:11 -- nvmf/run.sh@24 -- # local timen=1
00:07:32.296 21:30:11 -- nvmf/run.sh@25 -- # local core=0x1
00:07:32.296 21:30:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:07:32.296 21:30:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf
00:07:32.296 21:30:11 -- nvmf/run.sh@29 -- # printf %02d 11
00:07:32.296 21:30:11 -- nvmf/run.sh@29 -- # port=4411
00:07:32.296 21:30:11 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:07:32.296 21:30:11 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411'
00:07:32.296 21:30:11 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:32.296 21:30:11 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock
00:07:32.555 [2024-07-12 21:30:11.071403] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:07:32.555 [2024-07-12 21:30:11.071507] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3582463 ]
00:07:32.555 EAL: No free 2048 kB hugepages reported on node 1
00:07:32.556 [2024-07-12 21:30:11.255435] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:32.556 [2024-07-12 21:30:11.318517] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:32.556 [2024-07-12 21:30:11.318660] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:32.557 [2024-07-12 21:30:11.377037] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:32.557 [2024-07-12 21:30:11.393342] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 ***
00:07:32.557 INFO: Running with entropic power schedule (0xFF, 100).
00:07:32.557 INFO: Seed: 3963447509
00:07:32.557 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:07:32.557 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:07:32.557 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:07:32.557 INFO: A corpus is not provided, starting from an empty corpus
00:07:32.558 #2 INITED exec/s: 0 rss: 60Mb
00:07:32.558 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:32.814 This may also happen if the target rejected all inputs we tried so far 00:07:32.814 [2024-07-12 21:30:11.469858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.815 [2024-07-12 21:30:11.469896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.815 [2024-07-12 21:30:11.470022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.815 [2024-07-12 21:30:11.470042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.815 [2024-07-12 21:30:11.470175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.815 [2024-07-12 21:30:11.470191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.073 NEW_FUNC[1/671]: 0x48f860 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:33.073 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:33.074 #9 NEW cov: 11523 ft: 11524 corp: 2/28b lim: 40 exec/s: 0 rss: 67Mb L: 27/27 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:07:33.074 [2024-07-12 21:30:11.800716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.074 [2024-07-12 21:30:11.800761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.074 [2024-07-12 21:30:11.800913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.074 [2024-07-12 21:30:11.800937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.074 [2024-07-12 21:30:11.801083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:59000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.074 [2024-07-12 21:30:11.801105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.074 #10 NEW cov: 11636 ft: 12118 corp: 3/55b lim: 40 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 ChangeByte- 00:07:33.074 [2024-07-12 21:30:11.850829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.074 [2024-07-12 21:30:11.850856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.074 [2024-07-12 21:30:11.850987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.074 [2024-07-12 21:30:11.851003] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.074 [2024-07-12 21:30:11.851139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:59000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.074 [2024-07-12 21:30:11.851157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.333 #11 NEW cov: 11642 ft: 12407 corp: 4/82b lim: 40 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 CopyPart- 00:07:33.333 [2024-07-12 21:30:11.890592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.333 [2024-07-12 21:30:11.890620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.333 [2024-07-12 21:30:11.890757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000059 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.333 [2024-07-12 21:30:11.890773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.333 #12 NEW cov: 11727 ft: 12960 corp: 5/100b lim: 40 exec/s: 0 rss: 68Mb L: 18/27 MS: 1 EraseBytes- 00:07:33.333 [2024-07-12 21:30:11.930769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.333 [2024-07-12 21:30:11.930798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.333 [2024-07-12 21:30:11.930943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00005900 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.333 [2024-07-12 21:30:11.930961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.333 #13 NEW cov: 11727 ft: 13064 corp: 6/121b lim: 40 exec/s: 0 rss: 68Mb L: 21/27 MS: 1 InsertRepeatedBytes- 00:07:33.333 [2024-07-12 21:30:11.970919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.333 [2024-07-12 21:30:11.970946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.333 [2024-07-12 21:30:11.971080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00005900 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.333 [2024-07-12 21:30:11.971097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.333 #14 NEW cov: 11727 ft: 13105 corp: 7/142b lim: 40 exec/s: 0 rss: 68Mb L: 21/27 MS: 1 ShuffleBytes- 00:07:33.333 [2024-07-12 21:30:12.010949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.333 [2024-07-12 21:30:12.010975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.333 [2024-07-12 21:30:12.011107] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00590000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.333 [2024-07-12 21:30:12.011127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.333 #15 NEW cov: 11727 ft: 13164 corp: 8/163b lim: 40 exec/s: 0 rss: 68Mb L: 21/27 MS: 1 ShuffleBytes- 00:07:33.333 [2024-07-12 21:30:12.051375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04b20000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.333 [2024-07-12 21:30:12.051402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.333 [2024-07-12 21:30:12.051542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.333 [2024-07-12 21:30:12.051558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.333 [2024-07-12 21:30:12.051701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:59000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.333 [2024-07-12 21:30:12.051718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.333 #16 NEW cov: 11727 ft: 13225 corp: 9/190b lim: 40 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 ChangeByte- 00:07:33.334 [2024-07-12 21:30:12.091514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.334 [2024-07-12 21:30:12.091541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.334 [2024-07-12 21:30:12.091673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00590000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.334 [2024-07-12 21:30:12.091689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.334 [2024-07-12 21:30:12.091815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.334 [2024-07-12 21:30:12.091831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.593 #17 NEW cov: 11727 ft: 13316 corp: 10/221b lim: 40 exec/s: 0 rss: 69Mb L: 31/31 MS: 1 CopyPart- 00:07:33.593 [2024-07-12 21:30:12.131651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.593 [2024-07-12 21:30:12.131678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.593 [2024-07-12 21:30:12.131814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:fc000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.593 [2024-07-12 21:30:12.131830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.593 [2024-07-12 21:30:12.131967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.593 [2024-07-12 21:30:12.131984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.593 #18 NEW cov: 11727 ft: 13349 corp: 11/248b lim: 40 exec/s: 0 rss: 69Mb L: 27/31 MS: 1 ChangeBinInt- 00:07:33.593 [2024-07-12 21:30:12.171500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.593 [2024-07-12 21:30:12.171525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.593 [2024-07-12 21:30:12.171661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00005900 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.593 [2024-07-12 21:30:12.171676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.593 #19 NEW cov: 11727 ft: 13375 corp: 12/269b lim: 40 exec/s: 0 rss: 69Mb L: 21/31 MS: 1 ShuffleBytes- 00:07:33.593 [2024-07-12 21:30:12.211945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.593 [2024-07-12 21:30:12.211971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.593 [2024-07-12 21:30:12.212120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.593 [2024-07-12 21:30:12.212136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.593 [2024-07-12 21:30:12.212265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff31 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.593 [2024-07-12 21:30:12.212281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.593 #23 NEW cov: 11727 ft: 13389 corp: 13/293b lim: 40 exec/s: 0 rss: 69Mb L: 24/31 MS: 4 CrossOver-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:33.593 [2024-07-12 21:30:12.251749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.593 [2024-07-12 21:30:12.251775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.593 [2024-07-12 21:30:12.251895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00005900 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.593 [2024-07-12 21:30:12.251913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.593 #24 NEW cov: 11727 ft: 13421 corp: 14/314b lim: 40 exec/s: 0 rss: 69Mb L: 21/31 MS: 1 ChangeBit- 00:07:33.594 [2024-07-12 
21:30:12.291836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.594 [2024-07-12 21:30:12.291862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.594 [2024-07-12 21:30:12.291992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00590000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.594 [2024-07-12 21:30:12.292009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.594 #25 NEW cov: 11727 ft: 13431 corp: 15/335b lim: 40 exec/s: 0 rss: 69Mb L: 21/31 MS: 1 ShuffleBytes- 00:07:33.594 [2024-07-12 21:30:12.331714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.594 [2024-07-12 21:30:12.331742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.594 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:33.594 #27 NEW cov: 11750 ft: 14204 corp: 16/350b lim: 40 exec/s: 0 rss: 69Mb L: 15/31 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:33.594 [2024-07-12 21:30:12.372367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.594 [2024-07-12 21:30:12.372393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.594 [2024-07-12 21:30:12.372532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffeeff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.594 [2024-07-12 21:30:12.372548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.594 [2024-07-12 21:30:12.372662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff31 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.594 [2024-07-12 21:30:12.372684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.853 #28 NEW cov: 11750 ft: 14216 corp: 17/374b lim: 40 exec/s: 0 rss: 69Mb L: 24/31 MS: 1 ChangeByte- 00:07:33.853 [2024-07-12 21:30:12.412418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.853 [2024-07-12 21:30:12.412447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.853 [2024-07-12 21:30:12.412587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.853 [2024-07-12 21:30:12.412603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.853 [2024-07-12 21:30:12.412733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.853 [2024-07-12 21:30:12.412750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.853 #29 NEW cov: 11750 ft: 14242 corp: 18/399b lim: 40 exec/s: 0 rss: 69Mb L: 25/31 MS: 1 EraseBytes- 00:07:33.853 [2024-07-12 21:30:12.452762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.853 [2024-07-12 21:30:12.452788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.853 [2024-07-12 21:30:12.452915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00005900 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.452931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.854 [2024-07-12 21:30:12.453053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.453068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.854 [2024-07-12 21:30:12.453199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000059 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.453215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.854 #30 NEW cov: 11750 ft: 14562 corp: 19/438b lim: 40 exec/s: 30 rss: 69Mb L: 39/39 MS: 1 CopyPart- 00:07:33.854 [2024-07-12 21:30:12.492755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.492781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.854 [2024-07-12 21:30:12.492918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00590000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.492934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.854 [2024-07-12 21:30:12.493067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.493084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.854 #31 NEW cov: 11750 ft: 14589 corp: 20/469b lim: 40 exec/s: 31 rss: 69Mb L: 31/39 MS: 1 CopyPart- 00:07:33.854 [2024-07-12 21:30:12.532279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b000400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.532306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:33.854 #36 NEW cov: 11750 ft: 14625 corp: 21/484b lim: 40 exec/s: 36 rss: 69Mb L: 15/39 MS: 5 CrossOver-CrossOver-CopyPart-InsertByte-CrossOver- 00:07:33.854 [2024-07-12 21:30:12.573247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.573274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.854 [2024-07-12 21:30:12.573412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.573431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.854 [2024-07-12 21:30:12.573559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.573578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.854 [2024-07-12 21:30:12.573706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00590000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.573724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.854 #37 NEW cov: 11750 ft: 14631 corp: 22/523b lim: 40 exec/s: 37 rss: 69Mb L: 39/39 MS: 1 CrossOver- 00:07:33.854 [2024-07-12 21:30:12.622679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.622707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.854 [2024-07-12 21:30:12.622835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00002759 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.854 [2024-07-12 21:30:12.622852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.113 #38 NEW cov: 11750 ft: 14649 corp: 23/545b lim: 40 exec/s: 38 rss: 69Mb L: 22/39 MS: 1 InsertByte- 00:07:34.113 [2024-07-12 21:30:12.662727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.113 [2024-07-12 21:30:12.662756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.113 #39 NEW cov: 11750 ft: 14666 corp: 24/558b lim: 40 exec/s: 39 rss: 69Mb L: 13/39 MS: 1 EraseBytes- 00:07:34.113 [2024-07-12 21:30:12.713159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.113 [2024-07-12 21:30:12.713186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.113 [2024-07-12 21:30:12.713329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00005900 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.113 [2024-07-12 21:30:12.713347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.113 #40 NEW cov: 11750 ft: 14682 corp: 25/579b lim: 40 exec/s: 40 rss: 69Mb L: 21/39 MS: 1 ShuffleBytes- 00:07:34.113 [2024-07-12 21:30:12.753522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.113 [2024-07-12 21:30:12.753548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.113 [2024-07-12 21:30:12.753682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:fc000800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.113 [2024-07-12 21:30:12.753700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.113 [2024-07-12 21:30:12.753833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.113 [2024-07-12 21:30:12.753850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.113 #41 NEW cov: 11750 ft: 14688 corp: 26/606b lim: 40 exec/s: 41 rss: 69Mb L: 27/39 MS: 1 ChangeBit- 00:07:34.113 [2024-07-12 21:30:12.803835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.113 [2024-07-12 21:30:12.803862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.113 [2024-07-12 21:30:12.803991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.113 [2024-07-12 21:30:12.804007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.114 [2024-07-12 21:30:12.804134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.114 [2024-07-12 21:30:12.804150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.114 [2024-07-12 21:30:12.804281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00005900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.114 [2024-07-12 21:30:12.804299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.114 #42 NEW cov: 11750 ft: 14698 corp: 27/639b lim: 40 exec/s: 42 rss: 70Mb L: 33/39 MS: 1 InsertRepeatedBytes- 00:07:34.114 [2024-07-12 21:30:12.854050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.114 [2024-07-12 21:30:12.854076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.114 [2024-07-12 21:30:12.854215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:f3f3f3f3 cdw11:f3f3f3f3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.114 [2024-07-12 21:30:12.854233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.114 [2024-07-12 21:30:12.854357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:f3f3f3f3 cdw11:f3f3f3f3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.114 [2024-07-12 21:30:12.854375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.114 [2024-07-12 21:30:12.854510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:f3000000 cdw11:00000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.114 [2024-07-12 21:30:12.854529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.114 #43 NEW cov: 11750 ft: 14703 corp: 28/678b lim: 40 exec/s: 43 rss: 70Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:34.372 [2024-07-12 21:30:12.904158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:06000400 cdw11:00003e3e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:12.904184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.372 [2024-07-12 21:30:12.904308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3e3e3e3e cdw11:3e3e3e3e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:12.904327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.372 [2024-07-12 21:30:12.904448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3e3e3e3e cdw11:3e3e3e3e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:12.904464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.372 [2024-07-12 21:30:12.904598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:3e3e3e3e cdw11:3e3e3e3e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:12.904614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.372 #46 NEW cov: 11750 ft: 14709 corp: 29/710b lim: 40 exec/s: 46 rss: 70Mb L: 32/39 MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes- 00:07:34.372 [2024-07-12 21:30:12.954088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04b20000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:12.954116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.372 [2024-07-12 21:30:12.954235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:12.954252] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.372 [2024-07-12 21:30:12.954385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:59000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:12.954403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.372 #47 NEW cov: 11750 ft: 14715 corp: 30/737b lim: 40 exec/s: 47 rss: 70Mb L: 27/39 MS: 1 ChangeByte- 00:07:34.372 [2024-07-12 21:30:12.994202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:12.994230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.372 [2024-07-12 21:30:12.994367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:12.994385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.372 [2024-07-12 21:30:12.994515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:59000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:12.994532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.372 #48 NEW cov: 11750 ft: 14810 corp: 31/764b lim: 40 exec/s: 48 rss: 70Mb L: 27/39 MS: 1 ShuffleBytes- 00:07:34.372 [2024-07-12 21:30:13.034053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:13.034079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.372 [2024-07-12 21:30:13.034203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:80590000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:13.034219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.372 #49 NEW cov: 11750 ft: 14816 corp: 32/785b lim: 40 exec/s: 49 rss: 70Mb L: 21/39 MS: 1 ChangeBit- 00:07:34.372 [2024-07-12 21:30:13.084144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0400ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:13.084171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.372 [2024-07-12 21:30:13.084300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00002759 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:13.084316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.372 #50 NEW cov: 11750 ft: 14853 corp: 33/807b lim: 40 exec/s: 50 rss: 70Mb L: 22/39 MS: 1 ChangeBinInt- 00:07:34.372 [2024-07-12 
21:30:13.124283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fd000000 cdw11:00000008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:13.124310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.372 [2024-07-12 21:30:13.124434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00590000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.372 [2024-07-12 21:30:13.124455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.372 #51 NEW cov: 11750 ft: 14873 corp: 34/828b lim: 40 exec/s: 51 rss: 70Mb L: 21/39 MS: 1 ChangeBinInt- 00:07:34.672 [2024-07-12 21:30:13.164147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.164174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.672 #52 NEW cov: 11750 ft: 14939 corp: 35/841b lim: 40 exec/s: 52 rss: 70Mb L: 13/39 MS: 1 CrossOver- 00:07:34.672 [2024-07-12 21:30:13.204997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04b20000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.205024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.205150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.205168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.205290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:59151515 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.205307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.205426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:15151515 cdw11:15151500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.205454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.672 #53 NEW cov: 11750 ft: 14967 corp: 36/878b lim: 40 exec/s: 53 rss: 70Mb L: 37/39 MS: 1 InsertRepeatedBytes- 00:07:34.672 [2024-07-12 21:30:13.245154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04b20000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.245182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.245310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.245326] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.245458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00b3ed1a cdw11:1655fd28 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.245473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.245611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:59000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.245630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.672 #54 NEW cov: 11750 ft: 14989 corp: 37/913b lim: 40 exec/s: 54 rss: 70Mb L: 35/39 MS: 1 CMP- DE: "\263\355\032\026U\375(\000"- 00:07:34.672 [2024-07-12 21:30:13.294500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b000400 cdw11:000000b3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.294528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.672 #55 NEW cov: 11750 ft: 14992 corp: 38/928b lim: 40 exec/s: 55 rss: 70Mb L: 15/39 MS: 1 PersAutoDict- DE: "\263\355\032\026U\375(\000"- 00:07:34.672 [2024-07-12 21:30:13.335760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.335786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.335922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000059 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.335939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.336069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.336085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.336221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.336237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.336371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:59000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.336387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.672 #56 NEW cov: 11750 ft: 15046 corp: 39/968b lim: 40 exec/s: 56 rss: 70Mb L: 40/40 MS: 1 CrossOver- 00:07:34.672 [2024-07-12 21:30:13.375083] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:40000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.375109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.375224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00005900 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.375241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.672 #57 NEW cov: 11750 ft: 15054 corp: 40/989b lim: 40 exec/s: 57 rss: 70Mb L: 21/40 MS: 1 ChangeBit- 00:07:34.672 [2024-07-12 21:30:13.415385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:04000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.415411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.415538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:fc000800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.415556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.672 [2024-07-12 21:30:13.415681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.672 [2024-07-12 21:30:13.415697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.954 #58 NEW cov: 11750 ft: 15065 corp: 41/1016b lim: 40 exec/s: 29 rss: 70Mb L: 27/40 MS: 1 ShuffleBytes- 00:07:34.954 #58 DONE cov: 11750 ft: 15065 corp: 41/1016b lim: 40 exec/s: 29 rss: 70Mb 00:07:34.954 ###### Recommended dictionary. ###### 00:07:34.954 "\263\355\032\026U\375(\000" # Uses: 1 00:07:34.954 ###### End of recommended dictionary. 
###### 00:07:34.954 Done 58 runs in 2 second(s) 00:07:34.954 21:30:13 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:34.954 21:30:13 -- ../common.sh@72 -- # (( i++ )) 00:07:34.954 21:30:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:34.954 21:30:13 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:34.954 21:30:13 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:34.954 21:30:13 -- nvmf/run.sh@24 -- # local timen=1 00:07:34.954 21:30:13 -- nvmf/run.sh@25 -- # local core=0x1 00:07:34.954 21:30:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:34.954 21:30:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:34.954 21:30:13 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:34.954 21:30:13 -- nvmf/run.sh@29 -- # port=4412 00:07:34.954 21:30:13 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:34.954 21:30:13 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:34.954 21:30:13 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:34.954 21:30:13 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:34.954 [2024-07-12 21:30:13.608124] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:34.954 [2024-07-12 21:30:13.608211] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3582996 ] 00:07:34.954 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.213 [2024-07-12 21:30:13.786369] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.213 [2024-07-12 21:30:13.849676] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:35.213 [2024-07-12 21:30:13.849794] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.213 [2024-07-12 21:30:13.907714] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.213 [2024-07-12 21:30:13.923970] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:35.213 INFO: Running with entropic power schedule (0xFF, 100). 00:07:35.213 INFO: Seed: 2199475001 00:07:35.213 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:35.213 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:35.213 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:35.213 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.213 #2 INITED exec/s: 0 rss: 60Mb 00:07:35.213 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:35.213 This may also happen if the target rejected all inputs we tried so far 00:07:35.213 [2024-07-12 21:30:13.969680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.213 [2024-07-12 21:30:13.969708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.213 [2024-07-12 21:30:13.969764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.213 [2024-07-12 21:30:13.969778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.213 [2024-07-12 21:30:13.969832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.213 [2024-07-12 21:30:13.969845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.213 [2024-07-12 21:30:13.969898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.213 [2024-07-12 21:30:13.969911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.732 NEW_FUNC[1/671]: 0x4915d0 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:35.732 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:35.732 #4 NEW cov: 11521 ft: 11503 corp: 2/38b lim: 40 exec/s: 0 rss: 67Mb L: 37/37 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:35.732 [2024-07-12 21:30:14.280529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.732 [2024-07-12 21:30:14.280561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.732 [2024-07-12 21:30:14.280635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.732 [2024-07-12 21:30:14.280650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.732 [2024-07-12 21:30:14.280706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.732 [2024-07-12 21:30:14.280720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.732 [2024-07-12 21:30:14.280779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:44447a44 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.732 [2024-07-12 21:30:14.280792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.732 #5 NEW cov: 11634 
ft: 12110 corp: 3/76b lim: 40 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 InsertByte- 00:07:35.732 [2024-07-12 21:30:14.330281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.732 [2024-07-12 21:30:14.330308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.732 [2024-07-12 21:30:14.330364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.732 [2024-07-12 21:30:14.330378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.732 #11 NEW cov: 11640 ft: 12764 corp: 4/97b lim: 40 exec/s: 0 rss: 67Mb L: 21/38 MS: 1 InsertRepeatedBytes- 00:07:35.732 [2024-07-12 21:30:14.370703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:4444ff44 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.732 [2024-07-12 21:30:14.370730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.732 [2024-07-12 21:30:14.370803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.732 [2024-07-12 21:30:14.370818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.732 [2024-07-12 21:30:14.370877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.732 [2024-07-12 21:30:14.370890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.732 [2024-07-12 21:30:14.370945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:4444447a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.732 [2024-07-12 21:30:14.370959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.732 #12 NEW cov: 11725 ft: 13101 corp: 5/136b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 InsertByte- 00:07:35.732 [2024-07-12 21:30:14.410539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.732 [2024-07-12 21:30:14.410564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.732 [2024-07-12 21:30:14.410623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeb4aeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.733 [2024-07-12 21:30:14.410636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.733 #13 NEW cov: 11725 ft: 13140 corp: 6/157b lim: 40 exec/s: 0 rss: 67Mb L: 21/39 MS: 1 ChangeBinInt- 00:07:35.733 [2024-07-12 21:30:14.450655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae 
cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.733 [2024-07-12 21:30:14.450680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.733 [2024-07-12 21:30:14.450753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.733 [2024-07-12 21:30:14.450767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.733 #14 NEW cov: 11725 ft: 13239 corp: 7/177b lim: 40 exec/s: 0 rss: 67Mb L: 20/39 MS: 1 EraseBytes- 00:07:35.733 [2024-07-12 21:30:14.491046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.733 [2024-07-12 21:30:14.491071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.733 [2024-07-12 21:30:14.491129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.733 [2024-07-12 21:30:14.491142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.733 [2024-07-12 21:30:14.491201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.733 [2024-07-12 21:30:14.491214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.733 [2024-07-12 21:30:14.491272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444441 cdw11:44447a44 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.733 [2024-07-12 21:30:14.491286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.733 #20 NEW cov: 11725 ft: 13305 corp: 8/215b lim: 40 exec/s: 0 rss: 67Mb L: 38/39 MS: 1 ChangeByte- 00:07:35.992 [2024-07-12 21:30:14.530687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.992 [2024-07-12 21:30:14.530712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.992 #21 NEW cov: 11725 ft: 14116 corp: 9/230b lim: 40 exec/s: 0 rss: 67Mb L: 15/39 MS: 1 InsertRepeatedBytes- 00:07:35.992 [2024-07-12 21:30:14.570924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0008aeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.992 [2024-07-12 21:30:14.570948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.992 [2024-07-12 21:30:14.571007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.571021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.993 #22 NEW 
cov: 11725 ft: 14186 corp: 10/250b lim: 40 exec/s: 0 rss: 68Mb L: 20/39 MS: 1 CMP- DE: "\000\010"- 00:07:35.993 [2024-07-12 21:30:14.611067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.611091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.993 [2024-07-12 21:30:14.611149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.611163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.993 #23 NEW cov: 11725 ft: 14217 corp: 11/270b lim: 40 exec/s: 0 rss: 68Mb L: 20/39 MS: 1 ChangeBit- 00:07:35.993 [2024-07-12 21:30:14.651527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:4444ff44 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.651552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.993 [2024-07-12 21:30:14.651616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.651631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.993 [2024-07-12 21:30:14.651687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44443d44 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.651700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.993 [2024-07-12 21:30:14.651755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:4444447a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.651768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.993 #24 NEW cov: 11725 ft: 14299 corp: 12/309b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 ChangeBinInt- 00:07:35.993 [2024-07-12 21:30:14.691715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.691740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.993 [2024-07-12 21:30:14.691799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.691813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.993 [2024-07-12 21:30:14.691870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:4444448b cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.691884] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.993 [2024-07-12 21:30:14.691954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.691968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.993 #25 NEW cov: 11725 ft: 14314 corp: 13/346b lim: 40 exec/s: 0 rss: 68Mb L: 37/39 MS: 1 ChangeByte- 00:07:35.993 [2024-07-12 21:30:14.731417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.731448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.993 [2024-07-12 21:30:14.731505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff1a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.731519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.993 #27 NEW cov: 11725 ft: 14323 corp: 14/362b lim: 40 exec/s: 0 rss: 68Mb L: 16/39 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:35.993 [2024-07-12 21:30:14.771549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.771575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.993 [2024-07-12 21:30:14.771635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0008aeae cdw11:ae444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.993 [2024-07-12 21:30:14.771648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.253 #28 NEW cov: 11725 ft: 14416 corp: 15/384b lim: 40 exec/s: 0 rss: 68Mb L: 22/39 MS: 1 CrossOver- 00:07:36.253 [2024-07-12 21:30:14.811665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.811690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.253 [2024-07-12 21:30:14.811762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeb4aeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.811776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.253 #29 NEW cov: 11725 ft: 14436 corp: 16/405b lim: 40 exec/s: 0 rss: 68Mb L: 21/39 MS: 1 ShuffleBytes- 00:07:36.253 [2024-07-12 21:30:14.851927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0008aeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.851952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:36.253 [2024-07-12 21:30:14.852028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.852043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.253 [2024-07-12 21:30:14.852101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.852114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.253 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:36.253 #30 NEW cov: 11748 ft: 14654 corp: 17/430b lim: 40 exec/s: 0 rss: 68Mb L: 25/39 MS: 1 CopyPart- 00:07:36.253 [2024-07-12 21:30:14.892167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.892191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.253 [2024-07-12 21:30:14.892266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.892280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.253 [2024-07-12 21:30:14.892337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.892351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.253 [2024-07-12 21:30:14.892407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.892420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.253 #31 NEW cov: 11748 ft: 14687 corp: 18/466b lim: 40 exec/s: 0 rss: 68Mb L: 36/39 MS: 1 InsertRepeatedBytes- 00:07:36.253 [2024-07-12 21:30:14.931886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.931911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.253 #32 NEW cov: 11748 ft: 14723 corp: 19/478b lim: 40 exec/s: 32 rss: 68Mb L: 12/39 MS: 1 CrossOver- 00:07:36.253 [2024-07-12 21:30:14.972150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.972174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.253 [2024-07-12 21:30:14.972249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeae5248 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:14.972264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.253 #33 NEW cov: 11748 ft: 14735 corp: 20/499b lim: 40 exec/s: 33 rss: 68Mb L: 21/39 MS: 1 ChangeBinInt- 00:07:36.253 [2024-07-12 21:30:15.012596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:4444fd44 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:15.012621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.253 [2024-07-12 21:30:15.012695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:15.012709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.253 [2024-07-12 21:30:15.012766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:15.012780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.253 [2024-07-12 21:30:15.012836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:4444447a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.253 [2024-07-12 21:30:15.012850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.253 #34 NEW cov: 11748 ft: 14738 corp: 21/538b lim: 40 exec/s: 34 rss: 68Mb L: 39/39 MS: 1 ChangeBinInt- 00:07:36.512 [2024-07-12 21:30:15.052362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:4444aeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.052386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.052461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44aeae44 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.052476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.513 #35 NEW cov: 11748 ft: 14760 corp: 22/555b lim: 40 exec/s: 35 rss: 68Mb L: 17/39 MS: 1 EraseBytes- 00:07:36.513 [2024-07-12 21:30:15.092811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.092835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.092911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.092925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.092981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.092995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.093055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.093068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.513 #36 NEW cov: 11748 ft: 14773 corp: 23/592b lim: 40 exec/s: 36 rss: 68Mb L: 37/39 MS: 1 ShuffleBytes- 00:07:36.513 [2024-07-12 21:30:15.132973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.132997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.133056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.133070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.133127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.133140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.133196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.133210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.513 #37 NEW cov: 11748 ft: 14817 corp: 24/628b lim: 40 exec/s: 37 rss: 68Mb L: 36/39 MS: 1 CopyPart- 00:07:36.513 [2024-07-12 21:30:15.172690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:0008aeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.172715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.172775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.172788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.513 #38 NEW cov: 11748 ft: 14836 corp: 25/648b lim: 40 exec/s: 38 rss: 68Mb L: 20/39 MS: 1 PersAutoDict- DE: "\000\010"- 00:07:36.513 [2024-07-12 21:30:15.212668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.212693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.212752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:28aeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.212766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.513 #39 NEW cov: 11748 ft: 14888 corp: 26/669b lim: 40 exec/s: 39 rss: 68Mb L: 21/39 MS: 1 InsertByte- 00:07:36.513 [2024-07-12 21:30:15.253196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2a444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.253221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.253295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.253313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.253372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.253385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.253445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444441 cdw11:44447a44 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.253459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.513 #40 NEW cov: 11748 ft: 14907 corp: 27/707b lim: 40 exec/s: 40 rss: 69Mb L: 38/39 MS: 1 ChangeByte- 00:07:36.513 [2024-07-12 21:30:15.293036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:4444aeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.293060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.513 [2024-07-12 21:30:15.293118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44aeae40 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.513 [2024-07-12 21:30:15.293133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.773 #41 NEW cov: 11748 ft: 14919 corp: 28/724b lim: 40 exec/s: 41 rss: 69Mb L: 17/39 MS: 1 ChangeBit- 00:07:36.773 [2024-07-12 21:30:15.333013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.333037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.773 #42 NEW cov: 11748 ft: 14931 corp: 
29/736b lim: 40 exec/s: 42 rss: 69Mb L: 12/39 MS: 1 EraseBytes- 00:07:36.773 [2024-07-12 21:30:15.373307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.373331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.773 [2024-07-12 21:30:15.373390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeae5248 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.373403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.773 #43 NEW cov: 11748 ft: 14935 corp: 30/758b lim: 40 exec/s: 43 rss: 69Mb L: 22/39 MS: 1 InsertByte- 00:07:36.773 [2024-07-12 21:30:15.413390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.413414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.773 [2024-07-12 21:30:15.413474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffdfff1a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.413504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.773 #44 NEW cov: 11748 ft: 14956 corp: 31/774b lim: 40 exec/s: 44 rss: 69Mb L: 16/39 MS: 1 ChangeBit- 00:07:36.773 [2024-07-12 21:30:15.453867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.453892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.773 [2024-07-12 21:30:15.453970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.453985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.773 [2024-07-12 21:30:15.454043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.454057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.773 [2024-07-12 21:30:15.454124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444461 cdw11:44447a44 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.454138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.773 #45 NEW cov: 11748 ft: 14971 corp: 32/812b lim: 40 exec/s: 45 rss: 69Mb L: 38/39 MS: 1 ChangeBit- 00:07:36.773 [2024-07-12 21:30:15.493973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:4444ff44 cdw11:44444444 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.493998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.773 [2024-07-12 21:30:15.494057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.494071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.773 [2024-07-12 21:30:15.494128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.494142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.773 [2024-07-12 21:30:15.494197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.494211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.773 #46 NEW cov: 11748 ft: 15007 corp: 33/845b lim: 40 exec/s: 46 rss: 69Mb L: 33/39 MS: 1 EraseBytes- 00:07:36.773 [2024-07-12 21:30:15.534075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.534099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.773 [2024-07-12 21:30:15.534157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.534171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.773 [2024-07-12 21:30:15.534231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:4444444c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.534245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.773 [2024-07-12 21:30:15.534303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.773 [2024-07-12 21:30:15.534316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.773 #47 NEW cov: 11748 ft: 15036 corp: 34/882b lim: 40 exec/s: 47 rss: 69Mb L: 37/39 MS: 1 ChangeBinInt- 00:07:37.034 [2024-07-12 21:30:15.574237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2a444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.574261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.574320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 
cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.574334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.574390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44000844 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.574404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.574509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444441 cdw11:44447a44 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.574524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.034 #48 NEW cov: 11748 ft: 15042 corp: 35/920b lim: 40 exec/s: 48 rss: 69Mb L: 38/39 MS: 1 PersAutoDict- DE: "\000\010"- 00:07:37.034 [2024-07-12 21:30:15.614034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.614060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.614120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.614134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.034 #49 NEW cov: 11748 ft: 15045 corp: 36/938b lim: 40 exec/s: 49 rss: 69Mb L: 18/39 MS: 1 EraseBytes- 00:07:37.034 [2024-07-12 21:30:15.654142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.654168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.654231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:aeaeaeae cdw11:aeae1248 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.654247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.034 #50 NEW cov: 11748 ft: 15058 corp: 37/960b lim: 40 exec/s: 50 rss: 69Mb L: 22/39 MS: 1 ChangeBit- 00:07:37.034 [2024-07-12 21:30:15.694614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.694639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.694700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.694714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:37.034 [2024-07-12 21:30:15.694771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.694785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.694845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:44444400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.694859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.034 #51 NEW cov: 11748 ft: 15072 corp: 38/997b lim: 40 exec/s: 51 rss: 69Mb L: 37/39 MS: 1 ChangeBinInt- 00:07:37.034 [2024-07-12 21:30:15.734353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.734378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.734457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:beaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.734471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.034 #52 NEW cov: 11748 ft: 15132 corp: 39/1015b lim: 40 exec/s: 52 rss: 70Mb L: 18/39 MS: 1 ChangeBit- 00:07:37.034 [2024-07-12 21:30:15.774674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:4444ff44 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.774699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.774761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.774775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.774835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44443d44 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.774848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.034 #53 NEW cov: 11748 ft: 15136 corp: 40/1042b lim: 40 exec/s: 53 rss: 70Mb L: 27/39 MS: 1 EraseBytes- 00:07:37.034 [2024-07-12 21:30:15.815163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:4444fd44 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.815190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.815254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 
21:30:15.815268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.815330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.815343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.815406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.815420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.034 [2024-07-12 21:30:15.815493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:7a444444 cdw11:4444441d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.034 [2024-07-12 21:30:15.815508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.294 #54 NEW cov: 11748 ft: 15205 corp: 41/1082b lim: 40 exec/s: 54 rss: 70Mb L: 40/40 MS: 1 CrossOver- 00:07:37.294 [2024-07-12 21:30:15.855076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:44444444 cdw11:04444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.294 [2024-07-12 21:30:15.855100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.294 [2024-07-12 21:30:15.855177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.294 [2024-07-12 21:30:15.855191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.294 [2024-07-12 21:30:15.855252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.294 [2024-07-12 21:30:15.855266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.294 [2024-07-12 21:30:15.855324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.294 [2024-07-12 21:30:15.855337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.294 #55 NEW cov: 11748 ft: 15207 corp: 42/1119b lim: 40 exec/s: 55 rss: 70Mb L: 37/40 MS: 1 ChangeBit- 00:07:37.295 [2024-07-12 21:30:15.894964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:aeaeaeae cdw11:aeffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.295 [2024-07-12 21:30:15.894989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.295 [2024-07-12 21:30:15.895049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:37.295 [2024-07-12 21:30:15.895064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.295 [2024-07-12 21:30:15.895126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:aeaeaeae cdw11:aeaeaeae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.295 [2024-07-12 21:30:15.895139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.295 #56 NEW cov: 11748 ft: 15217 corp: 43/1147b lim: 40 exec/s: 56 rss: 70Mb L: 28/40 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:37.295 [2024-07-12 21:30:15.935291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:4444fa44 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.295 [2024-07-12 21:30:15.935317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.295 [2024-07-12 21:30:15.935390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.295 [2024-07-12 21:30:15.935404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.295 [2024-07-12 21:30:15.935459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.295 [2024-07-12 21:30:15.935473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.295 [2024-07-12 21:30:15.935528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:44444444 cdw11:44444444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.295 [2024-07-12 21:30:15.935544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.295 #57 NEW cov: 11748 ft: 15235 corp: 44/1180b lim: 40 exec/s: 28 rss: 70Mb L: 33/40 MS: 1 ChangeBinInt- 00:07:37.295 #57 DONE cov: 11748 ft: 15235 corp: 44/1180b lim: 40 exec/s: 28 rss: 70Mb 00:07:37.295 ###### Recommended dictionary. ###### 00:07:37.295 "\000\010" # Uses: 2 00:07:37.295 "\377\377\377\377\377\377\377\377" # Uses: 0 00:07:37.295 ###### End of recommended dictionary. 
###### 00:07:37.295 Done 57 runs in 2 second(s) 00:07:37.555 21:30:16 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:07:37.555 21:30:16 -- ../common.sh@72 -- # (( i++ )) 00:07:37.555 21:30:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:37.555 21:30:16 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:37.555 21:30:16 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:37.555 21:30:16 -- nvmf/run.sh@24 -- # local timen=1 00:07:37.555 21:30:16 -- nvmf/run.sh@25 -- # local core=0x1 00:07:37.555 21:30:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:37.555 21:30:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:37.555 21:30:16 -- nvmf/run.sh@29 -- # printf %02d 13 00:07:37.555 21:30:16 -- nvmf/run.sh@29 -- # port=4413 00:07:37.555 21:30:16 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:37.555 21:30:16 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:37.555 21:30:16 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:37.555 21:30:16 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:07:37.555 [2024-07-12 21:30:16.127176] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:37.555 [2024-07-12 21:30:16.127241] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3583516 ] 00:07:37.555 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.555 [2024-07-12 21:30:16.303862] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.814 [2024-07-12 21:30:16.367556] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:37.814 [2024-07-12 21:30:16.367699] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.814 [2024-07-12 21:30:16.425551] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:37.814 [2024-07-12 21:30:16.441842] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:37.814 INFO: Running with entropic power schedule (0xFF, 100). 00:07:37.814 INFO: Seed: 420517193 00:07:37.814 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:37.814 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:37.814 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:37.814 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.814 #2 INITED exec/s: 0 rss: 60Mb 00:07:37.814 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:37.814 This may also happen if the target rejected all inputs we tried so far 00:07:37.814 [2024-07-12 21:30:16.491020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.814 [2024-07-12 21:30:16.491048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.814 [2024-07-12 21:30:16.491123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.814 [2024-07-12 21:30:16.491140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.074 NEW_FUNC[1/670]: 0x493190 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:38.074 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.074 #5 NEW cov: 11509 ft: 11510 corp: 2/21b lim: 40 exec/s: 0 rss: 67Mb L: 20/20 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:07:38.074 [2024-07-12 21:30:16.801779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.074 [2024-07-12 21:30:16.801812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.074 [2024-07-12 21:30:16.801877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.074 [2024-07-12 21:30:16.801892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.074 #16 NEW cov: 11622 ft: 11986 corp: 3/41b lim: 40 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 ChangeByte- 00:07:38.074 [2024-07-12 21:30:16.841648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.074 [2024-07-12 21:30:16.841673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.333 #19 NEW cov: 11628 ft: 12665 corp: 4/53b lim: 40 exec/s: 0 rss: 67Mb L: 12/20 MS: 3 ChangeBit-CMP-CrossOver- DE: "\000\000\000\001"- 00:07:38.333 [2024-07-12 21:30:16.881784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.333 [2024-07-12 21:30:16.881809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.333 #20 NEW cov: 11713 ft: 12938 corp: 5/65b lim: 40 exec/s: 0 rss: 67Mb L: 12/20 MS: 1 CopyPart- 00:07:38.333 [2024-07-12 21:30:16.921861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff66ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.333 [2024-07-12 21:30:16.921885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:38.333 #21 NEW cov: 11713 ft: 13057 corp: 6/77b lim: 40 exec/s: 0 rss: 67Mb L: 12/20 MS: 1 ChangeByte- 00:07:38.333 [2024-07-12 21:30:16.962294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.333 [2024-07-12 21:30:16.962319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.333 [2024-07-12 21:30:16.962377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.333 [2024-07-12 21:30:16.962391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.333 [2024-07-12 21:30:16.962456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.333 [2024-07-12 21:30:16.962469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.333 #22 NEW cov: 11713 ft: 13417 corp: 7/108b lim: 40 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:07:38.333 [2024-07-12 21:30:17.002306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.334 [2024-07-12 21:30:17.002334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.334 [2024-07-12 21:30:17.002397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.334 [2024-07-12 21:30:17.002410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.334 #23 NEW cov: 11713 ft: 13498 corp: 8/129b lim: 40 exec/s: 0 rss: 68Mb L: 21/31 MS: 1 InsertByte- 00:07:38.334 [2024-07-12 21:30:17.042519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.334 [2024-07-12 21:30:17.042544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.334 [2024-07-12 21:30:17.042617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.334 [2024-07-12 21:30:17.042632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.334 [2024-07-12 21:30:17.042688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff4a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.334 [2024-07-12 21:30:17.042702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.334 #24 NEW cov: 11713 ft: 13549 corp: 9/153b lim: 40 exec/s: 0 rss: 68Mb L: 24/31 MS: 1 CopyPart- 00:07:38.334 [2024-07-12 21:30:17.082538] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.334 [2024-07-12 21:30:17.082563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.334 [2024-07-12 21:30:17.082622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.334 [2024-07-12 21:30:17.082635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.334 #25 NEW cov: 11713 ft: 13594 corp: 10/175b lim: 40 exec/s: 0 rss: 68Mb L: 22/31 MS: 1 InsertByte- 00:07:38.593 [2024-07-12 21:30:17.122682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.122706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.593 [2024-07-12 21:30:17.122783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffbfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.122798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.593 #26 NEW cov: 11713 ft: 13717 corp: 11/198b lim: 40 exec/s: 0 rss: 68Mb L: 23/31 MS: 1 InsertByte- 00:07:38.593 [2024-07-12 21:30:17.162643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:3fffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.162669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.593 #27 NEW cov: 11713 ft: 13793 corp: 12/211b lim: 40 exec/s: 0 rss: 68Mb L: 13/31 MS: 1 InsertByte- 00:07:38.593 [2024-07-12 21:30:17.203160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.203188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.593 [2024-07-12 21:30:17.203265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.203280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.593 [2024-07-12 21:30:17.203338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.203352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.593 [2024-07-12 21:30:17.203412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.203428] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.593 #28 NEW cov: 11713 ft: 14268 corp: 13/244b lim: 40 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 CrossOver- 00:07:38.593 [2024-07-12 21:30:17.242895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff66ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.242920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.593 #29 NEW cov: 11713 ft: 14285 corp: 14/256b lim: 40 exec/s: 0 rss: 68Mb L: 12/33 MS: 1 ChangeByte- 00:07:38.593 [2024-07-12 21:30:17.282996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0005ffff cdw11:ff3fffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.283022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.593 #30 NEW cov: 11713 ft: 14307 corp: 15/270b lim: 40 exec/s: 0 rss: 68Mb L: 14/33 MS: 1 InsertByte- 00:07:38.593 [2024-07-12 21:30:17.323100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.323125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.593 #33 NEW cov: 11713 ft: 14384 corp: 16/285b lim: 40 exec/s: 0 rss: 68Mb L: 15/33 MS: 3 CrossOver-ShuffleBytes-CrossOver- 00:07:38.593 [2024-07-12 21:30:17.363350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ffff2fff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.363374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.593 [2024-07-12 21:30:17.363449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.593 [2024-07-12 21:30:17.363464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.853 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.853 #34 NEW cov: 11736 ft: 14459 corp: 17/301b lim: 40 exec/s: 0 rss: 68Mb L: 16/33 MS: 1 InsertByte- 00:07:38.853 [2024-07-12 21:30:17.403452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.403477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.853 [2024-07-12 21:30:17.403530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffbfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.403547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.853 #35 NEW cov: 11736 ft: 14471 corp: 18/324b lim: 40 exec/s: 0 rss: 69Mb L: 
23/33 MS: 1 CopyPart- 00:07:38.853 [2024-07-12 21:30:17.443579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff3fffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.443604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.853 [2024-07-12 21:30:17.443681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.443696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.853 #36 NEW cov: 11736 ft: 14481 corp: 19/345b lim: 40 exec/s: 0 rss: 69Mb L: 21/33 MS: 1 InsertByte- 00:07:38.853 [2024-07-12 21:30:17.483684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:66ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.483710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.853 [2024-07-12 21:30:17.483770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:66ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.483783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.853 #37 NEW cov: 11736 ft: 14493 corp: 20/364b lim: 40 exec/s: 37 rss: 69Mb L: 19/33 MS: 1 CopyPart- 00:07:38.853 [2024-07-12 21:30:17.523910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.523935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.853 [2024-07-12 21:30:17.524011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.524025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.853 [2024-07-12 21:30:17.524087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:21ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.524100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.853 #38 NEW cov: 11736 ft: 14525 corp: 21/389b lim: 40 exec/s: 38 rss: 69Mb L: 25/33 MS: 1 InsertByte- 00:07:38.853 [2024-07-12 21:30:17.564037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff6600ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.564062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.853 [2024-07-12 21:30:17.564125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffff66 cdw11:ffffffff SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.564139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.853 [2024-07-12 21:30:17.564199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ff00ffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.564213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.853 #39 NEW cov: 11736 ft: 14533 corp: 22/413b lim: 40 exec/s: 39 rss: 69Mb L: 24/33 MS: 1 CopyPart- 00:07:38.853 [2024-07-12 21:30:17.604145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffd6d6d6 cdw11:d6ffffea SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.604170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.853 [2024-07-12 21:30:17.604244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.604259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.853 [2024-07-12 21:30:17.604319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffbfffff cdw11:ffffff25 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.853 [2024-07-12 21:30:17.604332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.853 #40 NEW cov: 11736 ft: 14624 corp: 23/440b lim: 40 exec/s: 40 rss: 69Mb L: 27/33 MS: 1 InsertRepeatedBytes- 00:07:39.113 [2024-07-12 21:30:17.644425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.644454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.644537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.644550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.644623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.644637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.644696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.644709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.113 #41 NEW cov: 11736 ft: 14636 corp: 24/476b lim: 40 exec/s: 41 rss: 69Mb L: 36/36 MS: 1 
InsertRepeatedBytes- 00:07:39.113 [2024-07-12 21:30:17.684232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffeaffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.684257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.684332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffbfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.684346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.113 #42 NEW cov: 11736 ft: 14649 corp: 25/499b lim: 40 exec/s: 42 rss: 69Mb L: 23/36 MS: 1 CopyPart- 00:07:39.113 [2024-07-12 21:30:17.724805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.724829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.724888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.724905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.724964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.724977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.725034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.725047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.725103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.725116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.113 #46 NEW cov: 11736 ft: 14693 corp: 26/539b lim: 40 exec/s: 46 rss: 69Mb L: 40/40 MS: 4 ShuffleBytes-ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:07:39.113 [2024-07-12 21:30:17.764492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.764516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.764572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 
21:30:17.764585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.113 #47 NEW cov: 11736 ft: 14770 corp: 27/561b lim: 40 exec/s: 47 rss: 69Mb L: 22/40 MS: 1 ShuffleBytes- 00:07:39.113 [2024-07-12 21:30:17.804619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffeaffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.804642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.804704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffbfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.804718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.113 #48 NEW cov: 11736 ft: 14782 corp: 28/584b lim: 40 exec/s: 48 rss: 69Mb L: 23/40 MS: 1 CopyPart- 00:07:39.113 [2024-07-12 21:30:17.844748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.844772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.844834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:bfffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.844848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.113 #49 NEW cov: 11736 ft: 14791 corp: 29/606b lim: 40 exec/s: 49 rss: 69Mb L: 22/40 MS: 1 ChangeBit- 00:07:39.113 [2024-07-12 21:30:17.884991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.885015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.885080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.885093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.113 [2024-07-12 21:30:17.885152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.113 [2024-07-12 21:30:17.885166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.373 #50 NEW cov: 11736 ft: 14857 corp: 30/637b lim: 40 exec/s: 50 rss: 69Mb L: 31/40 MS: 1 CrossOver- 00:07:39.373 [2024-07-12 21:30:17.925227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:17.925251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:17.925311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:17.925324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:17.925381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:17.925394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:17.925456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffff00 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:17.925486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.373 #51 NEW cov: 11736 ft: 14865 corp: 31/673b lim: 40 exec/s: 51 rss: 69Mb L: 36/40 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\000"- 00:07:39.373 [2024-07-12 21:30:17.965268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:17.965292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:17.965351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:17.965365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:17.965423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:17.965437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.373 #52 NEW cov: 11736 ft: 14916 corp: 32/702b lim: 40 exec/s: 52 rss: 69Mb L: 29/40 MS: 1 InsertRepeatedBytes- 00:07:39.373 [2024-07-12 21:30:18.005493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.005518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:18.005591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:bfffffff cdw11:ffffffea SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.005611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:18.005670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:bfffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:39.373 [2024-07-12 21:30:18.005683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:18.005740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.005754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.373 #53 NEW cov: 11736 ft: 14920 corp: 33/740b lim: 40 exec/s: 53 rss: 69Mb L: 38/40 MS: 1 CopyPart- 00:07:39.373 [2024-07-12 21:30:18.045446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.045470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:18.045541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffbfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.045554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:18.045615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffff25 cdw11:2cff404a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.045629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.373 #54 NEW cov: 11736 ft: 14921 corp: 34/764b lim: 40 exec/s: 54 rss: 69Mb L: 24/40 MS: 1 InsertByte- 00:07:39.373 [2024-07-12 21:30:18.075544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.075570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:18.075631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.075645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:18.075700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.075714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.373 #55 NEW cov: 11736 ft: 14941 corp: 35/790b lim: 40 exec/s: 55 rss: 69Mb L: 26/40 MS: 1 EraseBytes- 00:07:39.373 [2024-07-12 21:30:18.115687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.115712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 
21:30:18.115767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.115781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.373 [2024-07-12 21:30:18.115841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ff32ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.373 [2024-07-12 21:30:18.115857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.373 #56 NEW cov: 11736 ft: 14949 corp: 36/815b lim: 40 exec/s: 56 rss: 69Mb L: 25/40 MS: 1 InsertByte- 00:07:39.633 [2024-07-12 21:30:18.156096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.156122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.633 [2024-07-12 21:30:18.156179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.156193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.633 [2024-07-12 21:30:18.156252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.156265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.633 [2024-07-12 21:30:18.156323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.156336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.633 [2024-07-12 21:30:18.156392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.156406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.633 #57 NEW cov: 11736 ft: 14964 corp: 37/855b lim: 40 exec/s: 57 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:07:39.633 [2024-07-12 21:30:18.195788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fffffbea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.195812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.633 [2024-07-12 21:30:18.195873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.195886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.633 #58 NEW cov: 11736 ft: 14971 corp: 38/877b lim: 40 exec/s: 58 rss: 69Mb L: 22/40 MS: 1 ChangeBit- 00:07:39.633 [2024-07-12 21:30:18.236018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.236043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.633 [2024-07-12 21:30:18.236118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.236132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.633 [2024-07-12 21:30:18.236190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.236204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.633 #59 NEW cov: 11736 ft: 14988 corp: 39/908b lim: 40 exec/s: 59 rss: 70Mb L: 31/40 MS: 1 ChangeBit- 00:07:39.633 [2024-07-12 21:30:18.276113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff000000 cdw11:01ffffea SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.276137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.633 [2024-07-12 21:30:18.276200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.276214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.633 [2024-07-12 21:30:18.276273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffbfffff cdw11:ffffff25 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.276287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.633 #60 NEW cov: 11736 ft: 14994 corp: 40/935b lim: 40 exec/s: 60 rss: 70Mb L: 27/40 MS: 1 PersAutoDict- DE: "\000\000\000\001"- 00:07:39.633 [2024-07-12 21:30:18.316339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff00 cdw11:faffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.316364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.633 [2024-07-12 21:30:18.316424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.316438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.633 [2024-07-12 21:30:18.316505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 
nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.633 [2024-07-12 21:30:18.316519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.634 [2024-07-12 21:30:18.316577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.634 [2024-07-12 21:30:18.316590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.634 #61 NEW cov: 11736 ft: 15066 corp: 41/971b lim: 40 exec/s: 61 rss: 70Mb L: 36/40 MS: 1 ChangeBinInt- 00:07:39.634 [2024-07-12 21:30:18.356211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff7fffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.634 [2024-07-12 21:30:18.356235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.634 [2024-07-12 21:30:18.356296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.634 [2024-07-12 21:30:18.356310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.634 #62 NEW cov: 11736 ft: 15083 corp: 42/991b lim: 40 exec/s: 62 rss: 70Mb L: 20/40 MS: 1 ChangeBit- 00:07:39.634 [2024-07-12 21:30:18.386276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fffffbea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.634 [2024-07-12 21:30:18.386300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.634 [2024-07-12 21:30:18.386375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fffffffb cdw11:eaffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.634 [2024-07-12 21:30:18.386393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.634 #63 NEW cov: 11736 ft: 15098 corp: 43/1013b lim: 40 exec/s: 63 rss: 70Mb L: 22/40 MS: 1 CopyPart- 00:07:39.893 [2024-07-12 21:30:18.426576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff000000 cdw11:bfffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.893 [2024-07-12 21:30:18.426600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.893 [2024-07-12 21:30:18.426663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff252c cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.893 [2024-07-12 21:30:18.426676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.893 [2024-07-12 21:30:18.426735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffbfffff cdw11:ffffff25 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.893 [2024-07-12 21:30:18.426748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.893 #64 NEW cov: 11736 ft: 15106 corp: 44/1040b lim: 40 exec/s: 64 rss: 70Mb L: 27/40 MS: 1 CopyPart- 00:07:39.893 [2024-07-12 21:30:18.466805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.893 [2024-07-12 21:30:18.466828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.893 [2024-07-12 21:30:18.466887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.893 [2024-07-12 21:30:18.466901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.893 [2024-07-12 21:30:18.466973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ff32ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.893 [2024-07-12 21:30:18.466986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.893 [2024-07-12 21:30:18.467046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:4aff27fd cdw11:535556f5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.893 [2024-07-12 21:30:18.467060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.893 #65 NEW cov: 11736 ft: 15130 corp: 45/1073b lim: 40 exec/s: 32 rss: 70Mb L: 33/40 MS: 1 CMP- DE: "\377'\375SUV\365\342"- 00:07:39.893 #65 DONE cov: 11736 ft: 15130 corp: 45/1073b lim: 40 exec/s: 32 rss: 70Mb 00:07:39.893 ###### Recommended dictionary. ###### 00:07:39.893 "\000\000\000\001" # Uses: 1 00:07:39.893 "\377\377\377\377\377\377\377\000" # Uses: 0 00:07:39.893 "\377'\375SUV\365\342" # Uses: 0 00:07:39.893 ###### End of recommended dictionary. 
###### 00:07:39.893 Done 65 runs in 2 second(s) 00:07:39.893 21:30:18 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:07:39.893 21:30:18 -- ../common.sh@72 -- # (( i++ )) 00:07:39.893 21:30:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:39.893 21:30:18 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:07:39.893 21:30:18 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:07:39.893 21:30:18 -- nvmf/run.sh@24 -- # local timen=1 00:07:39.893 21:30:18 -- nvmf/run.sh@25 -- # local core=0x1 00:07:39.893 21:30:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:39.893 21:30:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:07:39.893 21:30:18 -- nvmf/run.sh@29 -- # printf %02d 14 00:07:39.893 21:30:18 -- nvmf/run.sh@29 -- # port=4414 00:07:39.893 21:30:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:39.893 21:30:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:07:39.894 21:30:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:39.894 21:30:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:07:39.894 [2024-07-12 21:30:18.656107] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:39.894 [2024-07-12 21:30:18.656176] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3583832 ] 00:07:40.153 EAL: No free 2048 kB hugepages reported on node 1 00:07:40.153 [2024-07-12 21:30:18.840264] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.153 [2024-07-12 21:30:18.904119] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:40.153 [2024-07-12 21:30:18.904244] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.412 [2024-07-12 21:30:18.962432] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:40.412 [2024-07-12 21:30:18.978734] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:07:40.412 INFO: Running with entropic power schedule (0xFF, 100). 00:07:40.412 INFO: Seed: 2958526597 00:07:40.412 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:40.412 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:40.412 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:40.412 INFO: A corpus is not provided, starting from an empty corpus 00:07:40.412 #2 INITED exec/s: 0 rss: 60Mb 00:07:40.412 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:40.412 This may also happen if the target rejected all inputs we tried so far 00:07:40.412 [2024-07-12 21:30:19.044722] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.412 [2024-07-12 21:30:19.044759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.671 NEW_FUNC[1/671]: 0x494d50 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:07:40.671 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:40.671 #7 NEW cov: 11503 ft: 11498 corp: 2/9b lim: 35 exec/s: 0 rss: 67Mb L: 8/8 MS: 5 InsertByte-ChangeByte-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:40.671 [2024-07-12 21:30:19.385558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.671 [2024-07-12 21:30:19.385600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.671 #11 NEW cov: 11623 ft: 12142 corp: 3/18b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 4 ChangeByte-ShuffleBytes-ChangeByte-CrossOver- 00:07:40.671 [2024-07-12 21:30:19.426021] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.671 [2024-07-12 21:30:19.426052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.671 NEW_FUNC[1/2]: 0x4b60f0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:40.671 NEW_FUNC[2/2]: 0x1153fb0 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1651 00:07:40.671 #13 NEW cov: 11662 ft: 12990 corp: 4/34b lim: 35 exec/s: 0 rss: 67Mb L: 16/16 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:40.930 #14 NEW cov: 11747 ft: 13287 corp: 5/46b lim: 35 exec/s: 0 rss: 67Mb L: 12/16 MS: 1 EraseBytes- 00:07:40.930 [2024-07-12 21:30:19.506214] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.930 [2024-07-12 21:30:19.506243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.930 #15 NEW cov: 11747 ft: 13356 corp: 6/62b lim: 35 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 ChangeByte- 00:07:40.930 [2024-07-12 21:30:19.546424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.930 [2024-07-12 21:30:19.546457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.930 #16 NEW cov: 11747 ft: 13444 corp: 7/78b lim: 35 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 ChangeBit- 00:07:40.930 [2024-07-12 21:30:19.586565] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.930 [2024-07-12 21:30:19.586593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:40.930 NEW_FUNC[1/1]: 0x4b65c0 in feat_async_event_cfg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:346 00:07:40.930 #17 NEW cov: 11847 ft: 13615 corp: 8/94b lim: 35 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 ChangeBinInt- 00:07:40.930 [2024-07-12 21:30:19.636397] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.930 [2024-07-12 21:30:19.636423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.930 #18 NEW cov: 11847 ft: 13786 corp: 9/102b lim: 35 exec/s: 0 rss: 67Mb L: 8/16 MS: 1 ChangeBinInt- 00:07:40.930 [2024-07-12 21:30:19.676398] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.930 [2024-07-12 21:30:19.676425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.930 #19 NEW cov: 11847 ft: 13815 corp: 10/118b lim: 35 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 ChangeByte- 00:07:41.190 [2024-07-12 21:30:19.716931] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.190 [2024-07-12 21:30:19.716960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.190 #20 NEW cov: 11847 ft: 13839 corp: 11/134b lim: 35 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 ShuffleBytes- 00:07:41.190 [2024-07-12 21:30:19.776696] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.190 [2024-07-12 21:30:19.776722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.190 #21 NEW cov: 11847 ft: 13896 corp: 12/146b lim: 35 exec/s: 0 rss: 68Mb L: 12/16 MS: 1 ShuffleBytes- 00:07:41.190 [2024-07-12 21:30:19.816742] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.190 [2024-07-12 21:30:19.816768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.190 #22 NEW cov: 11847 ft: 14012 corp: 13/153b lim: 35 exec/s: 0 rss: 68Mb L: 7/16 MS: 1 EraseBytes- 00:07:41.190 [2024-07-12 21:30:19.856498] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.190 [2024-07-12 21:30:19.856524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.190 #23 NEW cov: 11847 ft: 14082 corp: 14/161b lim: 35 exec/s: 0 rss: 68Mb L: 8/16 MS: 1 ShuffleBytes- 00:07:41.190 [2024-07-12 21:30:19.907335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.190 [2024-07-12 21:30:19.907362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.190 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:41.190 #24 NEW cov: 11870 ft: 14118 
corp: 15/178b lim: 35 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 InsertByte- 00:07:41.190 [2024-07-12 21:30:19.947559] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.190 [2024-07-12 21:30:19.947588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.449 #25 NEW cov: 11870 ft: 14130 corp: 16/194b lim: 35 exec/s: 0 rss: 68Mb L: 16/17 MS: 1 ChangeBinInt- 00:07:41.449 [2024-07-12 21:30:19.997736] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.449 [2024-07-12 21:30:19.997768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.449 #26 NEW cov: 11870 ft: 14148 corp: 17/210b lim: 35 exec/s: 26 rss: 68Mb L: 16/17 MS: 1 ChangeBit- 00:07:41.449 [2024-07-12 21:30:20.047107] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.449 [2024-07-12 21:30:20.047134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.449 #27 NEW cov: 11870 ft: 14162 corp: 18/217b lim: 35 exec/s: 27 rss: 68Mb L: 7/17 MS: 1 ChangeBinInt- 00:07:41.449 #28 NEW cov: 11870 ft: 14205 corp: 19/229b lim: 35 exec/s: 28 rss: 68Mb L: 12/17 MS: 1 ChangeBit- 00:07:41.449 [2024-07-12 21:30:20.138020] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.449 [2024-07-12 21:30:20.138048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.449 [2024-07-12 21:30:20.138178] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.449 [2024-07-12 21:30:20.138195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.449 #29 NEW cov: 11870 ft: 14475 corp: 20/246b lim: 35 exec/s: 29 rss: 69Mb L: 17/17 MS: 1 CrossOver- 00:07:41.449 #30 NEW cov: 11870 ft: 14525 corp: 21/255b lim: 35 exec/s: 30 rss: 69Mb L: 9/17 MS: 1 EraseBytes- 00:07:41.449 [2024-07-12 21:30:20.218018] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.449 [2024-07-12 21:30:20.218045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.708 #36 NEW cov: 11870 ft: 14536 corp: 22/264b lim: 35 exec/s: 36 rss: 69Mb L: 9/17 MS: 1 InsertByte- 00:07:41.708 [2024-07-12 21:30:20.258124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.708 [2024-07-12 21:30:20.258159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.708 #37 NEW cov: 11870 ft: 14560 corp: 23/271b lim: 35 exec/s: 37 rss: 69Mb L: 7/17 MS: 1 EraseBytes- 00:07:41.708 [2024-07-12 21:30:20.308607] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:41.708 [2024-07-12 21:30:20.308635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.708 #38 NEW cov: 11870 ft: 14583 corp: 24/289b lim: 35 exec/s: 38 rss: 69Mb L: 18/18 MS: 1 CMP- DE: "\004\000"- 00:07:41.708 [2024-07-12 21:30:20.358781] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.708 [2024-07-12 21:30:20.358812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.708 [2024-07-12 21:30:20.358951] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.708 [2024-07-12 21:30:20.358971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.708 #39 NEW cov: 11870 ft: 14648 corp: 25/306b lim: 35 exec/s: 39 rss: 69Mb L: 17/18 MS: 1 ShuffleBytes- 00:07:41.708 [2024-07-12 21:30:20.418627] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.708 [2024-07-12 21:30:20.418655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.708 #40 NEW cov: 11870 ft: 14664 corp: 26/314b lim: 35 exec/s: 40 rss: 69Mb L: 8/18 MS: 1 ChangeByte- 00:07:41.708 [2024-07-12 21:30:20.458724] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.708 [2024-07-12 21:30:20.458752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.708 #41 NEW cov: 11870 ft: 14679 corp: 27/330b lim: 35 exec/s: 41 rss: 69Mb L: 16/18 MS: 1 PersAutoDict- DE: "\004\000"- 00:07:41.967 [2024-07-12 21:30:20.509273] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.967 [2024-07-12 21:30:20.509303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.967 #47 NEW cov: 11870 ft: 14691 corp: 28/349b lim: 35 exec/s: 47 rss: 69Mb L: 19/19 MS: 1 InsertByte- 00:07:41.967 [2024-07-12 21:30:20.559381] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.967 [2024-07-12 21:30:20.559417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.967 #48 NEW cov: 11870 ft: 14712 corp: 29/365b lim: 35 exec/s: 48 rss: 69Mb L: 16/19 MS: 1 ChangeByte- 00:07:41.967 [2024-07-12 21:30:20.599261] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.967 [2024-07-12 21:30:20.599287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.967 [2024-07-12 21:30:20.599435] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.967 [2024-07-12 21:30:20.599457] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.967 #49 NEW cov: 11870 ft: 14725 corp: 30/381b lim: 35 exec/s: 49 rss: 69Mb L: 16/19 MS: 1 CopyPart- 00:07:41.967 [2024-07-12 21:30:20.639502] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.967 [2024-07-12 21:30:20.639530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.967 [2024-07-12 21:30:20.639648] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.967 [2024-07-12 21:30:20.639666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.967 #50 NEW cov: 11870 ft: 14751 corp: 31/397b lim: 35 exec/s: 50 rss: 70Mb L: 16/19 MS: 1 ShuffleBytes- 00:07:41.967 [2024-07-12 21:30:20.689324] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.967 [2024-07-12 21:30:20.689355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.967 #51 NEW cov: 11870 ft: 14768 corp: 32/408b lim: 35 exec/s: 51 rss: 70Mb L: 11/19 MS: 1 InsertRepeatedBytes- 00:07:42.227 #52 NEW cov: 11870 ft: 14791 corp: 33/416b lim: 35 exec/s: 52 rss: 70Mb L: 8/19 MS: 1 EraseBytes- 00:07:42.227 [2024-07-12 21:30:20.780031] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.227 [2024-07-12 21:30:20.780059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.227 #58 NEW cov: 11870 ft: 14810 corp: 34/433b lim: 35 exec/s: 58 rss: 70Mb L: 17/19 MS: 1 InsertByte- 00:07:42.227 [2024-07-12 21:30:20.830399] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.227 [2024-07-12 21:30:20.830427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.227 [2024-07-12 21:30:20.830574] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.227 [2024-07-12 21:30:20.830592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.227 #59 NEW cov: 11870 ft: 14914 corp: 35/459b lim: 35 exec/s: 59 rss: 70Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:42.227 [2024-07-12 21:30:20.870113] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000b2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.227 [2024-07-12 21:30:20.870140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.227 [2024-07-12 21:30:20.870269] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.227 [2024-07-12 21:30:20.870286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:42.227 #60 NEW cov: 11870 ft: 14925 corp: 36/476b lim: 35 exec/s: 60 rss: 70Mb L: 17/26 MS: 1 ChangeByte- 00:07:42.227 [2024-07-12 21:30:20.910399] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.227 [2024-07-12 21:30:20.910429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.227 #61 NEW cov: 11870 ft: 14986 corp: 37/492b lim: 35 exec/s: 61 rss: 70Mb L: 16/26 MS: 1 CopyPart- 00:07:42.227 [2024-07-12 21:30:20.950356] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.227 [2024-07-12 21:30:20.950381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.227 #62 NEW cov: 11870 ft: 14990 corp: 38/509b lim: 35 exec/s: 62 rss: 70Mb L: 17/26 MS: 1 ChangeBit- 00:07:42.227 [2024-07-12 21:30:20.990426] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.227 [2024-07-12 21:30:20.990456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.486 #68 NEW cov: 11870 ft: 15004 corp: 39/528b lim: 35 exec/s: 68 rss: 70Mb L: 19/26 MS: 1 InsertByte- 00:07:42.486 [2024-07-12 21:30:21.030623] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.486 [2024-07-12 21:30:21.030650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.486 [2024-07-12 21:30:21.030803] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000007a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.486 [2024-07-12 21:30:21.030821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.486 #69 NEW cov: 11870 ft: 15006 corp: 40/545b lim: 35 exec/s: 34 rss: 70Mb L: 17/26 MS: 1 CrossOver- 00:07:42.486 #69 DONE cov: 11870 ft: 15006 corp: 40/545b lim: 35 exec/s: 34 rss: 70Mb 00:07:42.486 ###### Recommended dictionary. ###### 00:07:42.486 "\004\000" # Uses: 1 00:07:42.486 ###### End of recommended dictionary. 
###### 00:07:42.486 Done 69 runs in 2 second(s) 00:07:42.486 21:30:21 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:07:42.486 21:30:21 -- ../common.sh@72 -- # (( i++ )) 00:07:42.486 21:30:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.486 21:30:21 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:07:42.486 21:30:21 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:07:42.486 21:30:21 -- nvmf/run.sh@24 -- # local timen=1 00:07:42.486 21:30:21 -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.486 21:30:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:42.486 21:30:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:07:42.487 21:30:21 -- nvmf/run.sh@29 -- # printf %02d 15 00:07:42.487 21:30:21 -- nvmf/run.sh@29 -- # port=4415 00:07:42.487 21:30:21 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:42.487 21:30:21 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:07:42.487 21:30:21 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.487 21:30:21 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:07:42.487 [2024-07-12 21:30:21.220715] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:42.487 [2024-07-12 21:30:21.220790] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3584368 ] 00:07:42.487 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.746 [2024-07-12 21:30:21.400606] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.746 [2024-07-12 21:30:21.463370] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:42.746 [2024-07-12 21:30:21.463514] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.746 [2024-07-12 21:30:21.521332] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:43.005 [2024-07-12 21:30:21.537594] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:07:43.005 INFO: Running with entropic power schedule (0xFF, 100). 00:07:43.005 INFO: Seed: 1223550161 00:07:43.005 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:43.005 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:43.005 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:43.005 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.005 #2 INITED exec/s: 0 rss: 61Mb 00:07:43.005 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:43.005 This may also happen if the target rejected all inputs we tried so far 00:07:43.005 [2024-07-12 21:30:21.603151] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.005 [2024-07-12 21:30:21.603181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.005 [2024-07-12 21:30:21.603242] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.005 [2024-07-12 21:30:21.603257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.005 [2024-07-12 21:30:21.603317] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.005 [2024-07-12 21:30:21.603337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.264 NEW_FUNC[1/669]: 0x496290 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:43.264 NEW_FUNC[2/669]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:43.264 #11 NEW cov: 11490 ft: 11477 corp: 2/22b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 4 CrossOver-ChangeBit-InsertByte-InsertRepeatedBytes- 00:07:43.264 [2024-07-12 21:30:21.934129] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.264 [2024-07-12 21:30:21.934172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.264 [2024-07-12 21:30:21.934241] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.265 [2024-07-12 21:30:21.934260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.265 [2024-07-12 21:30:21.934330] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.265 [2024-07-12 21:30:21.934348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.265 NEW_FUNC[1/1]: 0xf05630 in rte_rdtsc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/include/rte_cycles.h:31 00:07:43.265 #12 NEW cov: 11604 ft: 12068 corp: 3/43b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 CopyPart- 00:07:43.265 [2024-07-12 21:30:21.984012] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.265 [2024-07-12 21:30:21.984039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.265 [2024-07-12 21:30:21.984096] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.265 [2024-07-12 21:30:21.984110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.265 [2024-07-12 21:30:21.984167] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.265 [2024-07-12 21:30:21.984180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.265 #13 NEW cov: 11610 ft: 12470 corp: 4/65b lim: 35 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 InsertByte- 00:07:43.265 [2024-07-12 21:30:22.024166] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.265 [2024-07-12 21:30:22.024192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.265 [2024-07-12 21:30:22.024253] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.265 [2024-07-12 21:30:22.024267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.265 [2024-07-12 21:30:22.024329] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.265 [2024-07-12 21:30:22.024342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.524 #14 NEW cov: 11695 ft: 12676 corp: 5/86b lim: 35 exec/s: 0 rss: 67Mb L: 21/22 MS: 1 ChangeBit- 00:07:43.524 [2024-07-12 21:30:22.064379] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.064407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.064493] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.064507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.064565] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.064578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.064637] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.064650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.524 #15 NEW cov: 11695 ft: 13169 corp: 6/114b lim: 35 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:07:43.524 [2024-07-12 21:30:22.104121] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.104147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.524 #16 NEW cov: 11695 ft: 13623 corp: 7/125b lim: 35 
exec/s: 0 rss: 68Mb L: 11/28 MS: 1 EraseBytes- 00:07:43.524 [2024-07-12 21:30:22.144744] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000025 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.144770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.144845] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.144859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.144919] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.144932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.144992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.145005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.145064] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.145077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.524 #21 NEW cov: 11695 ft: 13732 corp: 8/160b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 5 CopyPart-ChangeByte-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:43.524 [2024-07-12 21:30:22.184710] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.184736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.184795] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.184808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.184852] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.184882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.184941] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.184955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.524 #22 NEW cov: 11695 ft: 13782 corp: 9/188b lim: 35 exec/s: 0 rss: 68Mb L: 28/35 MS: 1 ChangeBinInt- 00:07:43.524 [2024-07-12 21:30:22.224692] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.224718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.224781] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.224795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.224857] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.224871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.524 #23 NEW cov: 11695 ft: 13792 corp: 10/210b lim: 35 exec/s: 0 rss: 68Mb L: 22/35 MS: 1 ChangeByte- 00:07:43.524 [2024-07-12 21:30:22.264824] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.264849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.264927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.264941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.265003] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.265016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.524 #24 NEW cov: 11695 ft: 13815 corp: 11/232b lim: 35 exec/s: 0 rss: 68Mb L: 22/35 MS: 1 CMP- DE: "\004\000\000\000\000\000\000\000"- 00:07:43.524 [2024-07-12 21:30:22.304970] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.524 [2024-07-12 21:30:22.304996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.524 [2024-07-12 21:30:22.305062] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.525 [2024-07-12 21:30:22.305077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.525 [2024-07-12 21:30:22.305141] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.525 [2024-07-12 21:30:22.305154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.784 #25 NEW cov: 11695 ft: 13824 corp: 12/259b lim: 35 exec/s: 0 rss: 68Mb L: 27/35 MS: 1 CopyPart- 00:07:43.784 [2024-07-12 21:30:22.345158] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:43.784 [2024-07-12 21:30:22.345187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.784 [2024-07-12 21:30:22.345249] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.784 [2024-07-12 21:30:22.345263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.784 [2024-07-12 21:30:22.345324] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.784 [2024-07-12 21:30:22.345337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.784 [2024-07-12 21:30:22.345395] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.784 [2024-07-12 21:30:22.345408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.784 #26 NEW cov: 11695 ft: 13912 corp: 13/287b lim: 35 exec/s: 0 rss: 68Mb L: 28/35 MS: 1 ChangeByte- 00:07:43.784 [2024-07-12 21:30:22.385328] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.784 [2024-07-12 21:30:22.385353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.784 [2024-07-12 21:30:22.385412] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.784 [2024-07-12 21:30:22.385426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.784 [2024-07-12 21:30:22.385488] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.784 [2024-07-12 21:30:22.385503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 21:30:22.385558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.385572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.785 #27 NEW cov: 11695 ft: 13949 corp: 14/315b lim: 35 exec/s: 0 rss: 68Mb L: 28/35 MS: 1 ChangeBit- 00:07:43.785 [2024-07-12 21:30:22.425399] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.425423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 21:30:22.425499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.425513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 
21:30:22.425575] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.425589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 21:30:22.425658] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.425671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.785 #28 NEW cov: 11695 ft: 13967 corp: 15/346b lim: 35 exec/s: 0 rss: 68Mb L: 31/35 MS: 1 InsertRepeatedBytes- 00:07:43.785 [2024-07-12 21:30:22.465390] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.465414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 21:30:22.465490] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.465505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 21:30:22.465562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.465576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.785 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:43.785 #29 NEW cov: 11718 ft: 14039 corp: 16/371b lim: 35 exec/s: 0 rss: 69Mb L: 25/35 MS: 1 EraseBytes- 00:07:43.785 [2024-07-12 21:30:22.505627] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.505651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 21:30:22.505713] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.505726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 21:30:22.505783] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.505797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 21:30:22.505854] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.505867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.785 #30 NEW cov: 11718 ft: 14062 corp: 17/400b lim: 35 exec/s: 0 rss: 69Mb L: 29/35 MS: 
1 CopyPart- 00:07:43.785 [2024-07-12 21:30:22.545736] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.545761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 21:30:22.545836] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.545850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 21:30:22.545910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.545923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.785 [2024-07-12 21:30:22.545981] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.785 [2024-07-12 21:30:22.545995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.785 #31 NEW cov: 11718 ft: 14067 corp: 18/428b lim: 35 exec/s: 0 rss: 69Mb L: 28/35 MS: 1 ShuffleBytes- 00:07:44.044 [2024-07-12 21:30:22.585462] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.585490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.044 #32 NEW cov: 11718 ft: 14141 corp: 19/435b lim: 35 exec/s: 32 rss: 69Mb L: 7/35 MS: 1 EraseBytes- 00:07:44.044 [2024-07-12 21:30:22.625719] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.625744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.044 [2024-07-12 21:30:22.625805] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007fd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.625819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.044 #33 NEW cov: 11718 ft: 14369 corp: 20/453b lim: 35 exec/s: 33 rss: 69Mb L: 18/35 MS: 1 InsertRepeatedBytes- 00:07:44.044 [2024-07-12 21:30:22.666053] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.666079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.044 [2024-07-12 21:30:22.666140] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.666154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.044 [2024-07-12 21:30:22.666212] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.666226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.044 [2024-07-12 21:30:22.666285] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.666298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.044 #34 NEW cov: 11718 ft: 14392 corp: 21/482b lim: 35 exec/s: 34 rss: 69Mb L: 29/35 MS: 1 InsertByte- 00:07:44.044 [2024-07-12 21:30:22.706148] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.706173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.044 [2024-07-12 21:30:22.706232] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.706245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.044 [2024-07-12 21:30:22.706302] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.706316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.044 [2024-07-12 21:30:22.706373] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005ad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.706386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.044 #35 NEW cov: 11718 ft: 14409 corp: 22/512b lim: 35 exec/s: 35 rss: 69Mb L: 30/35 MS: 1 InsertRepeatedBytes- 00:07:44.044 [2024-07-12 21:30:22.746058] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.044 [2024-07-12 21:30:22.746084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.044 [2024-07-12 21:30:22.746146] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007fd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.045 [2024-07-12 21:30:22.746159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.045 #36 NEW cov: 11718 ft: 14460 corp: 23/530b lim: 35 exec/s: 36 rss: 69Mb L: 18/35 MS: 1 ChangeBinInt- 00:07:44.045 [2024-07-12 21:30:22.786373] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.045 [2024-07-12 21:30:22.786398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.045 [2024-07-12 21:30:22.786478] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 
cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.045 [2024-07-12 21:30:22.786492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.045 [2024-07-12 21:30:22.786552] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.045 [2024-07-12 21:30:22.786565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.045 [2024-07-12 21:30:22.786623] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.045 [2024-07-12 21:30:22.786637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.045 #41 NEW cov: 11718 ft: 14545 corp: 24/558b lim: 35 exec/s: 41 rss: 69Mb L: 28/35 MS: 5 InsertByte-ChangeByte-ChangeBit-CopyPart-InsertRepeatedBytes- 00:07:44.045 [2024-07-12 21:30:22.826381] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.045 [2024-07-12 21:30:22.826406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.045 [2024-07-12 21:30:22.826468] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.045 [2024-07-12 21:30:22.826482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.045 [2024-07-12 21:30:22.826540] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.045 [2024-07-12 21:30:22.826555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.303 #42 NEW cov: 11718 ft: 14599 corp: 25/585b lim: 35 exec/s: 42 rss: 69Mb L: 27/35 MS: 1 ChangeBinInt- 00:07:44.303 [2024-07-12 21:30:22.866505] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.866530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.303 [2024-07-12 21:30:22.866615] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.866628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.303 [2024-07-12 21:30:22.866687] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.866700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.303 #43 NEW cov: 11718 ft: 14615 corp: 26/606b lim: 35 exec/s: 43 rss: 69Mb L: 21/35 MS: 1 ChangeBit- 00:07:44.303 [2024-07-12 21:30:22.906721] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.906749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.303 [2024-07-12 21:30:22.906815] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.906830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.303 [2024-07-12 21:30:22.906889] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.906902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.303 [2024-07-12 21:30:22.906960] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005ad SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.906974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.303 #44 NEW cov: 11718 ft: 14667 corp: 27/639b lim: 35 exec/s: 44 rss: 69Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:07:44.303 [2024-07-12 21:30:22.946826] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.946850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.303 [2024-07-12 21:30:22.946911] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.946925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.303 [2024-07-12 21:30:22.946983] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.946996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.303 [2024-07-12 21:30:22.947053] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.947066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.303 #45 NEW cov: 11718 ft: 14677 corp: 28/668b lim: 35 exec/s: 45 rss: 69Mb L: 29/35 MS: 1 ChangeBinInt- 00:07:44.303 [2024-07-12 21:30:22.986856] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.986880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.303 [2024-07-12 21:30:22.986959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000425 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.986973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.303 
[2024-07-12 21:30:22.987035] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:22.987049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.303 #46 NEW cov: 11718 ft: 14758 corp: 29/693b lim: 35 exec/s: 46 rss: 70Mb L: 25/35 MS: 1 ChangeByte- 00:07:44.303 [2024-07-12 21:30:23.026835] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:23.026859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.303 [2024-07-12 21:30:23.026922] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000004ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:23.026936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.303 #47 NEW cov: 11718 ft: 14779 corp: 30/711b lim: 35 exec/s: 47 rss: 70Mb L: 18/35 MS: 1 CrossOver- 00:07:44.303 [2024-07-12 21:30:23.066938] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:23.066962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.303 [2024-07-12 21:30:23.067021] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.303 [2024-07-12 21:30:23.067034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.561 #48 NEW cov: 11718 ft: 14784 corp: 31/730b lim: 35 exec/s: 48 rss: 70Mb L: 19/35 MS: 1 EraseBytes- 00:07:44.561 [2024-07-12 21:30:23.107404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000025 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.561 [2024-07-12 21:30:23.107428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.561 [2024-07-12 21:30:23.107521] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.561 [2024-07-12 21:30:23.107536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.561 [2024-07-12 21:30:23.107591] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.561 [2024-07-12 21:30:23.107604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.561 [2024-07-12 21:30:23.107662] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.561 [2024-07-12 21:30:23.107675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.561 [2024-07-12 21:30:23.107731] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:8 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.561 [2024-07-12 21:30:23.107745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.561 #49 NEW cov: 11718 ft: 14789 corp: 32/765b lim: 35 exec/s: 49 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:07:44.561 [2024-07-12 21:30:23.147174] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.561 [2024-07-12 21:30:23.147198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.561 [2024-07-12 21:30:23.147274] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.561 [2024-07-12 21:30:23.147288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.561 #50 NEW cov: 11718 ft: 14809 corp: 33/784b lim: 35 exec/s: 50 rss: 70Mb L: 19/35 MS: 1 PersAutoDict- DE: "\004\000\000\000\000\000\000\000"- 00:07:44.561 [2024-07-12 21:30:23.187539] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.561 [2024-07-12 21:30:23.187564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.561 [2024-07-12 21:30:23.187641] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.187659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.562 [2024-07-12 21:30:23.187720] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.187734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.562 [2024-07-12 21:30:23.187792] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.187805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.562 #51 NEW cov: 11718 ft: 14810 corp: 34/813b lim: 35 exec/s: 51 rss: 70Mb L: 29/35 MS: 1 ChangeBit- 00:07:44.562 [2024-07-12 21:30:23.227543] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.227567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.562 [2024-07-12 21:30:23.227626] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.227640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.562 [2024-07-12 21:30:23.227701] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.227714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.562 #52 NEW cov: 11718 ft: 14825 corp: 35/836b lim: 35 exec/s: 52 rss: 70Mb L: 23/35 MS: 1 InsertByte- 00:07:44.562 [2024-07-12 21:30:23.267931] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000025 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.267955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.562 [2024-07-12 21:30:23.268017] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.268030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.562 [2024-07-12 21:30:23.268091] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.268104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.562 [2024-07-12 21:30:23.268162] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.268175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.562 [2024-07-12 21:30:23.268231] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.268244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.562 #53 NEW cov: 11718 ft: 14859 corp: 36/871b lim: 35 exec/s: 53 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:44.562 [2024-07-12 21:30:23.307901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.307927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.562 [2024-07-12 21:30:23.307986] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.308003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.562 [2024-07-12 21:30:23.308081] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.308095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.562 [2024-07-12 21:30:23.308154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.562 [2024-07-12 21:30:23.308167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.562 #54 
NEW cov: 11718 ft: 14870 corp: 37/900b lim: 35 exec/s: 54 rss: 70Mb L: 29/35 MS: 1 PersAutoDict- DE: "\004\000\000\000\000\000\000\000"- 00:07:44.820 [2024-07-12 21:30:23.347776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.820 [2024-07-12 21:30:23.347804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.820 [2024-07-12 21:30:23.347863] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.820 [2024-07-12 21:30:23.347878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.820 #55 NEW cov: 11718 ft: 14876 corp: 38/920b lim: 35 exec/s: 55 rss: 70Mb L: 20/35 MS: 1 InsertByte- 00:07:44.821 [2024-07-12 21:30:23.388102] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.388128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.388188] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.388202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.388264] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.388278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.388334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.388348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.821 #56 NEW cov: 11718 ft: 14883 corp: 39/951b lim: 35 exec/s: 56 rss: 70Mb L: 31/35 MS: 1 ChangeBit- 00:07:44.821 [2024-07-12 21:30:23.427855] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.427881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 #57 NEW cov: 11718 ft: 14892 corp: 40/958b lim: 35 exec/s: 57 rss: 70Mb L: 7/35 MS: 1 ChangeBit- 00:07:44.821 [2024-07-12 21:30:23.468458] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000025 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.468483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.468559] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.468576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.468634] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.468648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.468708] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.468721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.468780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.468793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.821 #58 NEW cov: 11718 ft: 14903 corp: 41/993b lim: 35 exec/s: 58 rss: 70Mb L: 35/35 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:44.821 [2024-07-12 21:30:23.508426] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.508455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.508518] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.508532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.508596] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.508609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.508669] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.508682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.821 #59 NEW cov: 11718 ft: 14911 corp: 42/1021b lim: 35 exec/s: 59 rss: 70Mb L: 28/35 MS: 1 ChangeBinInt- 00:07:44.821 [2024-07-12 21:30:23.548705] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000025 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.548731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.548810] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.548826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.548886] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000017 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.548900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.548959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.548972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.549032] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.549049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.821 #60 NEW cov: 11718 ft: 14923 corp: 43/1056b lim: 35 exec/s: 60 rss: 70Mb L: 35/35 MS: 1 CMP- DE: "\004\000\000\000\000\000\000\000"- 00:07:44.821 [2024-07-12 21:30:23.588693] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.588719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.588780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.588793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.588867] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.588881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.821 [2024-07-12 21:30:23.588941] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.821 [2024-07-12 21:30:23.588954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.080 #61 NEW cov: 11718 ft: 14950 corp: 44/1086b lim: 35 exec/s: 30 rss: 70Mb L: 30/35 MS: 1 InsertByte- 00:07:45.080 #61 DONE cov: 11718 ft: 14950 corp: 44/1086b lim: 35 exec/s: 30 rss: 70Mb 00:07:45.080 ###### Recommended dictionary. ###### 00:07:45.080 "\004\000\000\000\000\000\000\000" # Uses: 2 00:07:45.080 "\000\000\000\000" # Uses: 0 00:07:45.080 ###### End of recommended dictionary. 
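Aside for readers following the run above: every completion that fuzzer 15 printed decodes to the same generic-status rejection. In the "INVALID FIELD (00/02)" lines, the pair in parentheses is (status code type / status code) — SCT 0x00 is the generic command set and SC 0x02 is Invalid Field in Command — while sqhd, p, m and dnr are the submission-queue head pointer, phase tag, "more" bit and "do not retry" bit carried in the completion's status word. A minimal C sketch of that unpacking follows; the bit layout is per the NVMe base spec, but the struct and function names here are illustrative only, not SPDK's actual nvme_qpair.c code.

#include <stdint.h>
#include <stdio.h>

/* DW2/DW3 of a 16-byte NVMe completion queue entry. */
struct cpl_tail {
    uint16_t sqhd;   /* DW2 bits 15:0,  submission queue head    */
    uint16_t sqid;   /* DW2 bits 31:16, submission queue id      */
    uint16_t cid;    /* DW3 bits 15:0,  command identifier       */
    uint16_t status; /* DW3 bits 31:16, phase tag + status field */
};

static void print_cpl(const struct cpl_tail *c, uint32_t cdw0)
{
    unsigned p   = c->status & 0x1;          /* phase tag        */
    unsigned sc  = (c->status >> 1) & 0xff;  /* status code      */
    unsigned sct = (c->status >> 9) & 0x7;   /* status code type */
    unsigned m   = (c->status >> 14) & 0x1;  /* "more" bit       */
    unsigned dnr = (c->status >> 15) & 0x1;  /* do not retry     */

    /* sct=0x00, sc=0x02 is the "INVALID FIELD (00/02)" seen above;
     * sct=0x00, sc=0x0b is "INVALID NAMESPACE OR FORMAT (00/0b)". */
    printf("(%02x/%02x) cid:%u cdw0:%x sqhd:%04x p:%u m:%u dnr:%u\n",
           sct, sc, (unsigned)c->cid, (unsigned)cdw0,
           (unsigned)c->sqhd, p, m, dnr);
}

The dnr:0 throughout this run means the target marks each rejection as retryable; from the fuzzer's point of view the interesting property is simply that every malformed GET FEATURES command comes back as a well-formed error completion rather than a hang or a crash.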
###### 00:07:45.080 Done 61 runs in 2 second(s) 00:07:45.080 21:30:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:07:45.080 21:30:23 -- ../common.sh@72 -- # (( i++ )) 00:07:45.080 21:30:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.080 21:30:23 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:07:45.080 21:30:23 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:07:45.080 21:30:23 -- nvmf/run.sh@24 -- # local timen=1 00:07:45.080 21:30:23 -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.080 21:30:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:45.080 21:30:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:07:45.080 21:30:23 -- nvmf/run.sh@29 -- # printf %02d 16 00:07:45.080 21:30:23 -- nvmf/run.sh@29 -- # port=4416 00:07:45.080 21:30:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:45.080 21:30:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:07:45.080 21:30:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.080 21:30:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:07:45.080 [2024-07-12 21:30:23.768212] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:45.080 [2024-07-12 21:30:23.768282] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3584842 ] 00:07:45.080 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.339 [2024-07-12 21:30:23.946312] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.339 [2024-07-12 21:30:24.009054] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:45.339 [2024-07-12 21:30:24.009195] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.339 [2024-07-12 21:30:24.067176] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:45.339 [2024-07-12 21:30:24.083477] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:07:45.339 INFO: Running with entropic power schedule (0xFF, 100). 00:07:45.339 INFO: Seed: 3769541312 00:07:45.599 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:45.599 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:45.599 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:45.599 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.599 #2 INITED exec/s: 0 rss: 60Mb 00:07:45.599 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
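Aside: the banner above ("Loaded 1 modules (341341 inline 8-bit counters)", the entropic power schedule, "A corpus is not provided, starting from an empty corpus") is stock libFuzzer start-up output, and the NEW_FUNC[...] TestOneInput / fuzz_nvm_read_command frames in the findings below are the SPDK harness itself (llvm_nvme_fuzz.c). For orientation only, the generic shape of such a libFuzzer entry point looks like the sketch below; this is not SPDK's actual code, and how SPDK wires the callback into its app framework and the tcp/4416 listener set up above is more than this log shows.

#include <stddef.h>
#include <stdint.h>

/* libFuzzer calls this once per generated input; inputs that produce
 * new coverage are kept in the corpus (the "#N NEW cov:" lines). */
int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    if (size == 0) {
        return 0;
    }
    /* Hypothetical body: map data[0..size) onto an NVMe command,
     * submit it over the transport configured by the runner script,
     * and require a completion back instead of a crash or hang. */
    return 0;
}

Compiled with clang -fsanitize=fuzzer, an entry point of this shape is the whole libFuzzer-facing contract; the -m/-s/-P/-F/-c/-t/-D/-Z/-r options in the invocation above configure the SPDK application side rather than libFuzzer itself.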
00:07:45.599 This may also happen if the target rejected all inputs we tried so far 00:07:45.599 [2024-07-12 21:30:24.148708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.599 [2024-07-12 21:30:24.148738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.599 [2024-07-12 21:30:24.148789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.599 [2024-07-12 21:30:24.148804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.858 NEW_FUNC[1/671]: 0x497740 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:07:45.858 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:45.858 #4 NEW cov: 11592 ft: 11595 corp: 2/44b lim: 105 exec/s: 0 rss: 67Mb L: 43/43 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:45.859 [2024-07-12 21:30:24.469855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.469913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.859 [2024-07-12 21:30:24.470011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.470040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.859 [2024-07-12 21:30:24.470121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.470150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:45.859 [2024-07-12 21:30:24.470231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.470259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:45.859 #10 NEW cov: 11707 ft: 12672 corp: 3/145b lim: 105 exec/s: 0 rss: 67Mb L: 101/101 MS: 1 InsertRepeatedBytes- 00:07:45.859 [2024-07-12 21:30:24.519547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.519577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.859 [2024-07-12 21:30:24.519647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.519663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.859 #15 NEW cov: 11713 ft: 13170 corp: 4/189b lim: 105 exec/s: 0 rss: 67Mb L: 44/101 MS: 5 InsertByte-ChangeBit-EraseBytes-ChangeBit-CrossOver- 00:07:45.859 [2024-07-12 21:30:24.559661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.559688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.859 [2024-07-12 21:30:24.559745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.559760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.859 #16 NEW cov: 11798 ft: 13409 corp: 5/233b lim: 105 exec/s: 0 rss: 67Mb L: 44/101 MS: 1 CrossOver- 00:07:45.859 [2024-07-12 21:30:24.599617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.599643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.859 #17 NEW cov: 11798 ft: 13985 corp: 6/257b lim: 105 exec/s: 0 rss: 67Mb L: 24/101 MS: 1 EraseBytes- 00:07:45.859 [2024-07-12 21:30:24.640071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6365935212178331736 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.640098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:45.859 [2024-07-12 21:30:24.640139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6365935209750747224 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.640153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:45.859 [2024-07-12 21:30:24.640206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6365935209750747224 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.859 [2024-07-12 21:30:24.640222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.119 #22 NEW cov: 11798 ft: 14310 corp: 7/321b lim: 105 exec/s: 0 rss: 67Mb L: 64/101 MS: 5 CopyPart-ShuffleBytes-InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:07:46.119 [2024-07-12 21:30:24.680216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.680242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.119 [2024-07-12 21:30:24.680290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.680305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:07:46.119 [2024-07-12 21:30:24.680356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.680372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.119 [2024-07-12 21:30:24.680424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.680440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:46.119 #23 NEW cov: 11798 ft: 14378 corp: 8/424b lim: 105 exec/s: 0 rss: 67Mb L: 103/103 MS: 1 CopyPart- 00:07:46.119 [2024-07-12 21:30:24.720093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.720118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.119 [2024-07-12 21:30:24.720155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.720171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.119 #24 NEW cov: 11798 ft: 14419 corp: 9/467b lim: 105 exec/s: 0 rss: 67Mb L: 43/103 MS: 1 ShuffleBytes- 00:07:46.119 [2024-07-12 21:30:24.760218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.760243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.119 [2024-07-12 21:30:24.760281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.760296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.119 #25 NEW cov: 11798 ft: 14429 corp: 10/512b lim: 105 exec/s: 0 rss: 67Mb L: 45/103 MS: 1 InsertByte- 00:07:46.119 [2024-07-12 21:30:24.790595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.790621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.119 [2024-07-12 21:30:24.790678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.790691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.119 [2024-07-12 21:30:24.790745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.790759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.119 [2024-07-12 21:30:24.790814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.790829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:46.119 #26 NEW cov: 11798 ft: 14472 corp: 11/613b lim: 105 exec/s: 0 rss: 67Mb L: 101/103 MS: 1 ShuffleBytes- 00:07:46.119 [2024-07-12 21:30:24.830303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.830328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.119 #27 NEW cov: 11798 ft: 14551 corp: 12/654b lim: 105 exec/s: 0 rss: 68Mb L: 41/103 MS: 1 EraseBytes- 00:07:46.119 [2024-07-12 21:30:24.870578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.870616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.119 [2024-07-12 21:30:24.870684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709496063 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.119 [2024-07-12 21:30:24.870699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.119 #28 NEW cov: 11798 ft: 14575 corp: 13/698b lim: 105 exec/s: 0 rss: 68Mb L: 44/103 MS: 1 ChangeByte- 00:07:46.378 [2024-07-12 21:30:24.910779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:24.910805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.378 [2024-07-12 21:30:24.910858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:24.910874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.378 [2024-07-12 21:30:24.910930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:24.910946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.378 #29 NEW cov: 11798 ft: 14591 corp: 14/762b lim: 105 exec/s: 0 rss: 68Mb L: 64/103 MS: 1 CrossOver- 00:07:46.378 [2024-07-12 21:30:24.950912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:24.950939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.378 
[2024-07-12 21:30:24.950974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:24.950989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.378 [2024-07-12 21:30:24.951045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:24.951061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.378 #30 NEW cov: 11798 ft: 14633 corp: 15/826b lim: 105 exec/s: 0 rss: 68Mb L: 64/103 MS: 1 ShuffleBytes- 00:07:46.378 [2024-07-12 21:30:24.990989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:24.991016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.378 [2024-07-12 21:30:24.991066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18302628885616918528 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:24.991082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.378 #31 NEW cov: 11798 ft: 14673 corp: 16/869b lim: 105 exec/s: 0 rss: 68Mb L: 43/103 MS: 1 ChangeBinInt- 00:07:46.378 [2024-07-12 21:30:25.030992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:25.031019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.378 [2024-07-12 21:30:25.031053] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:25.031069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.378 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:46.378 #32 NEW cov: 11821 ft: 14757 corp: 17/912b lim: 105 exec/s: 0 rss: 68Mb L: 43/103 MS: 1 CrossOver- 00:07:46.378 [2024-07-12 21:30:25.071126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070472207103 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:25.071153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.378 [2024-07-12 21:30:25.071195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709496063 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:25.071210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.378 #33 NEW cov: 11821 ft: 14791 corp: 18/956b lim: 105 exec/s: 0 rss: 68Mb L: 44/103 
MS: 1 ChangeBinInt- 00:07:46.378 [2024-07-12 21:30:25.111359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:25.111386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.378 [2024-07-12 21:30:25.111427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:25.111446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.378 [2024-07-12 21:30:25.111500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446523071872368639 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:25.111515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.378 #34 NEW cov: 11821 ft: 14809 corp: 19/1020b lim: 105 exec/s: 34 rss: 68Mb L: 64/103 MS: 1 ChangeByte- 00:07:46.378 [2024-07-12 21:30:25.151540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:25.151566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.378 [2024-07-12 21:30:25.151602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:25.151616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.378 [2024-07-12 21:30:25.151667] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.378 [2024-07-12 21:30:25.151682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.637 #35 NEW cov: 11821 ft: 14817 corp: 20/1099b lim: 105 exec/s: 35 rss: 68Mb L: 79/103 MS: 1 InsertRepeatedBytes- 00:07:46.637 [2024-07-12 21:30:25.191564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6365935212178309208 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.191591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.637 [2024-07-12 21:30:25.191654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6365935209750747224 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.191670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.637 [2024-07-12 21:30:25.191725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6365935209750747224 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.191740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:07:46.637 #36 NEW cov: 11821 ft: 14838 corp: 21/1164b lim: 105 exec/s: 36 rss: 68Mb L: 65/103 MS: 1 InsertByte- 00:07:46.637 [2024-07-12 21:30:25.231718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6365935212178309208 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.231745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.637 [2024-07-12 21:30:25.231784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6365935209750747224 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.231799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.637 [2024-07-12 21:30:25.231852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6365935209750747224 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.231867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.637 #37 NEW cov: 11821 ft: 14957 corp: 22/1230b lim: 105 exec/s: 37 rss: 68Mb L: 66/103 MS: 1 InsertByte- 00:07:46.637 [2024-07-12 21:30:25.271699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070472207103 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.271725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.637 [2024-07-12 21:30:25.271774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709496063 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.271790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.637 #38 NEW cov: 11821 ft: 14987 corp: 23/1282b lim: 105 exec/s: 38 rss: 69Mb L: 52/103 MS: 1 CMP- DE: "\001(\375W\271:\016\224"- 00:07:46.637 [2024-07-12 21:30:25.311794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.311821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.637 [2024-07-12 21:30:25.311873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709496063 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.311888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.637 #39 NEW cov: 11821 ft: 15027 corp: 24/1333b lim: 105 exec/s: 39 rss: 69Mb L: 51/103 MS: 1 CopyPart- 00:07:46.637 [2024-07-12 21:30:25.352113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.352140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.637 [2024-07-12 21:30:25.352204] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.637 [2024-07-12 21:30:25.352219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.637 [2024-07-12 21:30:25.352274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.638 [2024-07-12 21:30:25.352289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.638 [2024-07-12 21:30:25.352344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:297 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.638 [2024-07-12 21:30:25.352361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:46.638 #40 NEW cov: 11821 ft: 15031 corp: 25/1436b lim: 105 exec/s: 40 rss: 69Mb L: 103/103 MS: 1 PersAutoDict- DE: "\001(\375W\271:\016\224"- 00:07:46.638 [2024-07-12 21:30:25.391973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744070472207103 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.638 [2024-07-12 21:30:25.392000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.638 [2024-07-12 21:30:25.392053] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709496063 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.638 [2024-07-12 21:30:25.392072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.638 #41 NEW cov: 11821 ft: 15066 corp: 26/1489b lim: 105 exec/s: 41 rss: 69Mb L: 53/103 MS: 1 InsertByte- 00:07:46.897 [2024-07-12 21:30:25.432126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.432154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.432193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1085367510196233743 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.432208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.897 #42 NEW cov: 11821 ft: 15083 corp: 27/1536b lim: 105 exec/s: 42 rss: 69Mb L: 47/103 MS: 1 InsertRepeatedBytes- 00:07:46.897 [2024-07-12 21:30:25.472333] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.472358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.472394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.472408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.472465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.472480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.897 #43 NEW cov: 11821 ft: 15090 corp: 28/1615b lim: 105 exec/s: 43 rss: 69Mb L: 79/103 MS: 1 ChangeBinInt- 00:07:46.897 [2024-07-12 21:30:25.512367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.512393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.512431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18302628885616918528 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.512451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.897 #44 NEW cov: 11821 ft: 15125 corp: 29/1658b lim: 105 exec/s: 44 rss: 69Mb L: 43/103 MS: 1 ChangeByte- 00:07:46.897 [2024-07-12 21:30:25.552476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.552501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.552563] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6365935209750747224 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.552578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.897 #45 NEW cov: 11821 ft: 15129 corp: 30/1701b lim: 105 exec/s: 45 rss: 69Mb L: 43/103 MS: 1 CrossOver- 00:07:46.897 [2024-07-12 21:30:25.592715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.592741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.592789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.592805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.592859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446523071872368639 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.592875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.897 #46 NEW cov: 11821 ft: 15134 corp: 31/1765b lim: 105 exec/s: 46 rss: 69Mb L: 64/103 MS: 1 ShuffleBytes- 00:07:46.897 [2024-07-12 21:30:25.632831] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.632857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.632922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.632937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.632992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744069414584573 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.633007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.897 #47 NEW cov: 11821 ft: 15165 corp: 32/1848b lim: 105 exec/s: 47 rss: 69Mb L: 83/103 MS: 1 CopyPart- 00:07:46.897 [2024-07-12 21:30:25.673049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.673074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.673144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.673161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.673213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.673228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:46.897 [2024-07-12 21:30:25.673283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.897 [2024-07-12 21:30:25.673300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:47.157 #48 NEW cov: 11821 ft: 15172 corp: 33/1937b lim: 105 exec/s: 48 rss: 69Mb L: 89/103 MS: 1 InsertRepeatedBytes- 00:07:47.157 [2024-07-12 21:30:25.713028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6365935212178309208 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.713053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.157 [2024-07-12 21:30:25.713089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6365935209750747224 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.713104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:07:47.157 [2024-07-12 21:30:25.713160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6365935209750747224 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.713176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.157 #49 NEW cov: 11821 ft: 15191 corp: 34/2002b lim: 105 exec/s: 49 rss: 69Mb L: 65/103 MS: 1 ShuffleBytes- 00:07:47.157 [2024-07-12 21:30:25.752891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.752916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.157 #55 NEW cov: 11821 ft: 15208 corp: 35/2037b lim: 105 exec/s: 55 rss: 70Mb L: 35/103 MS: 1 EraseBytes- 00:07:47.157 [2024-07-12 21:30:25.793052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.793078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.157 #56 NEW cov: 11821 ft: 15239 corp: 36/2078b lim: 105 exec/s: 56 rss: 70Mb L: 41/103 MS: 1 ChangeBit- 00:07:47.157 [2024-07-12 21:30:25.833364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:6365935212178331736 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.833390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.157 [2024-07-12 21:30:25.833436] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6365935209750747224 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.833455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.157 [2024-07-12 21:30:25.833510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6365935209750747224 len:22617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.833541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.157 #57 NEW cov: 11821 ft: 15246 corp: 37/2142b lim: 105 exec/s: 57 rss: 70Mb L: 64/103 MS: 1 ShuffleBytes- 00:07:47.157 [2024-07-12 21:30:25.873466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.873491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.157 [2024-07-12 21:30:25.873539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.873554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.157 [2024-07-12 21:30:25.873614] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.873629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.157 #58 NEW cov: 11821 ft: 15279 corp: 38/2214b lim: 105 exec/s: 58 rss: 70Mb L: 72/103 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:47.157 [2024-07-12 21:30:25.913357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744072619690751 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.157 [2024-07-12 21:30:25.913382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.157 #59 NEW cov: 11821 ft: 15291 corp: 39/2250b lim: 105 exec/s: 59 rss: 70Mb L: 36/103 MS: 1 EraseBytes- 00:07:47.417 [2024-07-12 21:30:25.953576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.417 [2024-07-12 21:30:25.953602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.417 [2024-07-12 21:30:25.953639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18302628885616918528 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.417 [2024-07-12 21:30:25.953654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.417 #60 NEW cov: 11821 ft: 15303 corp: 40/2303b lim: 105 exec/s: 60 rss: 70Mb L: 53/103 MS: 1 CrossOver- 00:07:47.417 [2024-07-12 21:30:25.993685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.417 [2024-07-12 21:30:25.993711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.417 [2024-07-12 21:30:25.993746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.417 [2024-07-12 21:30:25.993761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.417 #61 NEW cov: 11821 ft: 15305 corp: 41/2346b lim: 105 exec/s: 61 rss: 70Mb L: 43/103 MS: 1 CrossOver- 00:07:47.417 [2024-07-12 21:30:26.033975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.417 [2024-07-12 21:30:26.034001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.417 [2024-07-12 21:30:26.034049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744072522173962 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.417 [2024-07-12 21:30:26.034065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.417 [2024-07-12 21:30:26.034119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 
nsid:0 lba:13835058055282163711 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.417 [2024-07-12 21:30:26.034133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:47.417 #62 NEW cov: 11821 ft: 15311 corp: 42/2410b lim: 105 exec/s: 62 rss: 70Mb L: 64/103 MS: 1 CrossOver- 00:07:47.417 [2024-07-12 21:30:26.073928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.417 [2024-07-12 21:30:26.073954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.417 [2024-07-12 21:30:26.074001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.417 [2024-07-12 21:30:26.074016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.417 #63 NEW cov: 11821 ft: 15330 corp: 43/2453b lim: 105 exec/s: 63 rss: 70Mb L: 43/103 MS: 1 ChangeBinInt- 00:07:47.417 [2024-07-12 21:30:26.114055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.417 [2024-07-12 21:30:26.114081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:47.417 [2024-07-12 21:30:26.114151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:13238251629368031159 len:47032 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.417 [2024-07-12 21:30:26.114166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:47.417 #64 pulse cov: 11821 ft: 15334 corp: 43/2453b lim: 105 exec/s: 32 rss: 70Mb 00:07:47.417 #64 NEW cov: 11821 ft: 15334 corp: 44/2509b lim: 105 exec/s: 32 rss: 70Mb L: 56/103 MS: 1 InsertRepeatedBytes- 00:07:47.417 #64 DONE cov: 11821 ft: 15334 corp: 44/2509b lim: 105 exec/s: 32 rss: 70Mb 00:07:47.417 ###### Recommended dictionary. ###### 00:07:47.417 "\001(\375W\271:\016\224" # Uses: 1 00:07:47.417 "\377\377\377\377\377\377\377\377" # Uses: 0 00:07:47.417 ###### End of recommended dictionary. 
###### 00:07:47.417 Done 64 runs in 2 second(s) 00:07:47.676 21:30:26 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:07:47.676 21:30:26 -- ../common.sh@72 -- # (( i++ )) 00:07:47.676 21:30:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:47.676 21:30:26 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:07:47.676 21:30:26 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:07:47.676 21:30:26 -- nvmf/run.sh@24 -- # local timen=1 00:07:47.676 21:30:26 -- nvmf/run.sh@25 -- # local core=0x1 00:07:47.676 21:30:26 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:47.676 21:30:26 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:07:47.676 21:30:26 -- nvmf/run.sh@29 -- # printf %02d 17 00:07:47.676 21:30:26 -- nvmf/run.sh@29 -- # port=4417 00:07:47.676 21:30:26 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:47.677 21:30:26 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:07:47.677 21:30:26 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:47.677 21:30:26 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:07:47.677 [2024-07-12 21:30:26.294550] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:47.677 [2024-07-12 21:30:26.294617] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3585204 ] 00:07:47.677 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.935 [2024-07-12 21:30:26.468612] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.935 [2024-07-12 21:30:26.533459] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:47.935 [2024-07-12 21:30:26.533584] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.935 [2024-07-12 21:30:26.591630] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.935 [2024-07-12 21:30:26.607917] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:07:47.935 INFO: Running with entropic power schedule (0xFF, 100). 00:07:47.935 INFO: Seed: 1998592942 00:07:47.935 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:47.935 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:47.935 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:47.935 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.935 #2 INITED exec/s: 0 rss: 60Mb 00:07:47.935 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:47.935 This may also happen if the target rejected all inputs we tried so far 00:07:47.935 [2024-07-12 21:30:26.673767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.935 [2024-07-12 21:30:26.673807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.501 NEW_FUNC[1/672]: 0x49aa30 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:07:48.501 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:48.501 #4 NEW cov: 11615 ft: 11616 corp: 2/38b lim: 120 exec/s: 0 rss: 67Mb L: 37/37 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:48.501 [2024-07-12 21:30:27.014898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.501 [2024-07-12 21:30:27.014955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.501 [2024-07-12 21:30:27.015085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.501 [2024-07-12 21:30:27.015115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.501 #15 NEW cov: 11728 ft: 13037 corp: 3/88b lim: 120 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:07:48.501 [2024-07-12 21:30:27.064749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.501 [2024-07-12 21:30:27.064775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.501 #18 NEW cov: 11734 ft: 13280 corp: 4/126b lim: 120 exec/s: 0 rss: 67Mb L: 38/50 MS: 3 ChangeByte-CrossOver-CrossOver- 00:07:48.501 [2024-07-12 21:30:27.105296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.501 [2024-07-12 21:30:27.105326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.501 [2024-07-12 21:30:27.105432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:190958540947456 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.501 [2024-07-12 21:30:27.105460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.501 [2024-07-12 21:30:27.105576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.501 [2024-07-12 21:30:27.105597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.501 #19 NEW cov: 11819 ft: 13896 corp: 5/220b lim: 120 exec/s: 0 rss: 67Mb L: 94/94 MS: 1 InsertRepeatedBytes- 00:07:48.501 [2024-07-12 21:30:27.154955] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.501 [2024-07-12 21:30:27.154981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.501 #20 NEW cov: 11819 ft: 14116 corp: 6/257b lim: 120 exec/s: 0 rss: 67Mb L: 37/94 MS: 1 ChangeBinInt- 00:07:48.501 [2024-07-12 21:30:27.195123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.501 [2024-07-12 21:30:27.195148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.501 #26 NEW cov: 11819 ft: 14187 corp: 7/295b lim: 120 exec/s: 0 rss: 67Mb L: 38/94 MS: 1 ChangeBinInt- 00:07:48.501 [2024-07-12 21:30:27.235200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.501 [2024-07-12 21:30:27.235226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.501 #27 NEW cov: 11819 ft: 14301 corp: 8/333b lim: 120 exec/s: 0 rss: 67Mb L: 38/94 MS: 1 ChangeByte- 00:07:48.760 [2024-07-12 21:30:27.285428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.760 [2024-07-12 21:30:27.285459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.760 #28 NEW cov: 11819 ft: 14388 corp: 9/363b lim: 120 exec/s: 0 rss: 68Mb L: 30/94 MS: 1 EraseBytes- 00:07:48.760 [2024-07-12 21:30:27.325578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.760 [2024-07-12 21:30:27.325609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.760 #29 NEW cov: 11819 ft: 14408 corp: 10/400b lim: 120 exec/s: 0 rss: 68Mb L: 37/94 MS: 1 ShuffleBytes- 00:07:48.760 [2024-07-12 21:30:27.365617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.760 [2024-07-12 21:30:27.365644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.760 #30 NEW cov: 11819 ft: 14438 corp: 11/438b lim: 120 exec/s: 0 rss: 68Mb L: 38/94 MS: 1 ChangeBit- 00:07:48.760 [2024-07-12 21:30:27.405651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.760 [2024-07-12 21:30:27.405680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.760 #31 NEW cov: 11819 ft: 14494 corp: 12/468b lim: 120 exec/s: 0 rss: 68Mb L: 30/94 MS: 1 ChangeByte- 00:07:48.760 [2024-07-12 21:30:27.445885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755946817257680 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.760 [2024-07-12 21:30:27.445912] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.760 #32 NEW cov: 11819 ft: 14499 corp: 13/506b lim: 120 exec/s: 0 rss: 68Mb L: 38/94 MS: 1 CrossOver- 00:07:48.760 [2024-07-12 21:30:27.486434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.760 [2024-07-12 21:30:27.486471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:48.760 [2024-07-12 21:30:27.486560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:190958540947456 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.760 [2024-07-12 21:30:27.486587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:48.760 [2024-07-12 21:30:27.486703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.760 [2024-07-12 21:30:27.486727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:48.760 #33 NEW cov: 11819 ft: 14542 corp: 14/600b lim: 120 exec/s: 0 rss: 68Mb L: 94/94 MS: 1 ChangeByte- 00:07:48.760 [2024-07-12 21:30:27.536202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.760 [2024-07-12 21:30:27.536229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.018 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:49.018 #34 NEW cov: 11842 ft: 14573 corp: 15/637b lim: 120 exec/s: 0 rss: 68Mb L: 37/94 MS: 1 ShuffleBytes- 00:07:49.018 [2024-07-12 21:30:27.586930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.018 [2024-07-12 21:30:27.586963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.018 [2024-07-12 21:30:27.587079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.018 [2024-07-12 21:30:27.587104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.018 [2024-07-12 21:30:27.587225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.018 [2024-07-12 21:30:27.587246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.018 #35 NEW cov: 11842 ft: 14609 corp: 16/710b lim: 120 exec/s: 0 rss: 68Mb L: 73/94 MS: 1 CopyPart- 00:07:49.018 [2024-07-12 21:30:27.626410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.018 [2024-07-12 21:30:27.626437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.018 #36 NEW cov: 11842 ft: 14615 corp: 17/748b lim: 120 exec/s: 0 rss: 68Mb L: 38/94 MS: 1 ShuffleBytes- 00:07:49.018 [2024-07-12 21:30:27.666517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.018 [2024-07-12 21:30:27.666550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.018 #37 NEW cov: 11842 ft: 14634 corp: 18/787b lim: 120 exec/s: 37 rss: 68Mb L: 39/94 MS: 1 EraseBytes- 00:07:49.018 [2024-07-12 21:30:27.706940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.018 [2024-07-12 21:30:27.706973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.018 [2024-07-12 21:30:27.707090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.018 [2024-07-12 21:30:27.707110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.018 #38 NEW cov: 11842 ft: 14665 corp: 19/857b lim: 120 exec/s: 38 rss: 69Mb L: 70/94 MS: 1 EraseBytes- 00:07:49.018 [2024-07-12 21:30:27.747054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.018 [2024-07-12 21:30:27.747086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.018 [2024-07-12 21:30:27.747193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:190958540947456 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.018 [2024-07-12 21:30:27.747218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.018 [2024-07-12 21:30:27.747337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.018 [2024-07-12 21:30:27.747359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.018 #39 NEW cov: 11842 ft: 14693 corp: 20/951b lim: 120 exec/s: 39 rss: 69Mb L: 94/94 MS: 1 ShuffleBytes- 00:07:49.018 [2024-07-12 21:30:27.787301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.019 [2024-07-12 21:30:27.787334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.019 [2024-07-12 21:30:27.787455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:190958540947456 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.019 [2024-07-12 21:30:27.787473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.019 [2024-07-12 21:30:27.787607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE sqid:1 cid:2 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.019 [2024-07-12 21:30:27.787630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.276 #40 NEW cov: 11842 ft: 14709 corp: 21/1045b lim: 120 exec/s: 40 rss: 69Mb L: 94/94 MS: 1 ChangeBinInt- 00:07:49.276 [2024-07-12 21:30:27.826999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755950319947984 len:65320 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.276 [2024-07-12 21:30:27.827032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.276 #41 NEW cov: 11842 ft: 14762 corp: 22/1075b lim: 120 exec/s: 41 rss: 69Mb L: 30/94 MS: 1 CMP- DE: "\377'\375X\304\317\"\306"- 00:07:49.276 [2024-07-12 21:30:27.867083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.276 [2024-07-12 21:30:27.867114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.276 #42 NEW cov: 11842 ft: 14787 corp: 23/1113b lim: 120 exec/s: 42 rss: 69Mb L: 38/94 MS: 1 ChangeBit- 00:07:49.276 [2024-07-12 21:30:27.907176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:769 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.276 [2024-07-12 21:30:27.907200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.276 #48 NEW cov: 11842 ft: 14797 corp: 24/1151b lim: 120 exec/s: 48 rss: 69Mb L: 38/94 MS: 1 ChangeBinInt- 00:07:49.276 [2024-07-12 21:30:27.937812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.276 [2024-07-12 21:30:27.937842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.276 [2024-07-12 21:30:27.937929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.276 [2024-07-12 21:30:27.937949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.276 [2024-07-12 21:30:27.938062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.276 [2024-07-12 21:30:27.938084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.276 #49 NEW cov: 11842 ft: 14813 corp: 25/1224b lim: 120 exec/s: 49 rss: 69Mb L: 73/94 MS: 1 ShuffleBytes- 00:07:49.276 [2024-07-12 21:30:27.977382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755946817257680 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.276 [2024-07-12 21:30:27.977409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.276 #50 NEW cov: 11842 ft: 14839 corp: 26/1262b lim: 120 exec/s: 50 rss: 69Mb L: 38/94 
MS: 1 ShuffleBytes- 00:07:49.276 [2024-07-12 21:30:28.027670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.276 [2024-07-12 21:30:28.027696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.276 #51 NEW cov: 11842 ft: 14847 corp: 27/1309b lim: 120 exec/s: 51 rss: 69Mb L: 47/94 MS: 1 CopyPart- 00:07:49.534 [2024-07-12 21:30:28.067941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.067972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.534 [2024-07-12 21:30:28.068080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.068117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.534 #52 NEW cov: 11842 ft: 14851 corp: 28/1373b lim: 120 exec/s: 52 rss: 69Mb L: 64/94 MS: 1 InsertRepeatedBytes- 00:07:49.534 [2024-07-12 21:30:28.107796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:209 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.107820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.534 [2024-07-12 21:30:28.137851] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:209 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.137881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.534 #54 NEW cov: 11842 ft: 14878 corp: 29/1411b lim: 120 exec/s: 54 rss: 69Mb L: 38/94 MS: 2 CrossOver-ChangeByte- 00:07:49.534 [2024-07-12 21:30:28.178219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.178248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.534 [2024-07-12 21:30:28.178338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.178358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.534 #55 NEW cov: 11842 ft: 14885 corp: 30/1481b lim: 120 exec/s: 55 rss: 69Mb L: 70/94 MS: 1 ShuffleBytes- 00:07:49.534 [2024-07-12 21:30:28.218636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.218668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.534 [2024-07-12 21:30:28.218756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:190958540947456 len:44462 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.218777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.534 [2024-07-12 21:30:28.218890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.218912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.534 #56 NEW cov: 11842 ft: 14901 corp: 31/1575b lim: 120 exec/s: 56 rss: 69Mb L: 94/94 MS: 1 ShuffleBytes- 00:07:49.534 [2024-07-12 21:30:28.258228] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.258261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.534 #57 NEW cov: 11842 ft: 14915 corp: 32/1605b lim: 120 exec/s: 57 rss: 69Mb L: 30/94 MS: 1 ShuffleBytes- 00:07:49.534 [2024-07-12 21:30:28.298834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.298866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.534 [2024-07-12 21:30:28.298955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.298976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.534 [2024-07-12 21:30:28.299092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1446803456761533460 len:5141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.534 [2024-07-12 21:30:28.299111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:49.792 #58 NEW cov: 11842 ft: 14940 corp: 33/1690b lim: 120 exec/s: 58 rss: 70Mb L: 85/94 MS: 1 InsertRepeatedBytes- 00:07:49.792 [2024-07-12 21:30:28.338512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046806898014290128 len:50384 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.792 [2024-07-12 21:30:28.338545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.792 #59 NEW cov: 11842 ft: 14961 corp: 34/1728b lim: 120 exec/s: 59 rss: 70Mb L: 38/94 MS: 1 PersAutoDict- DE: "\377'\375X\304\317\"\306"- 00:07:49.792 [2024-07-12 21:30:28.378765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.792 [2024-07-12 21:30:28.378792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.792 [2024-07-12 21:30:28.419051] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755946984376832 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.792 [2024-07-12 21:30:28.419082] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.792 [2024-07-12 21:30:28.419184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.792 [2024-07-12 21:30:28.419205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:49.792 #61 NEW cov: 11842 ft: 15030 corp: 35/1783b lim: 120 exec/s: 61 rss: 70Mb L: 55/94 MS: 2 ChangeBinInt-CrossOver- 00:07:49.792 [2024-07-12 21:30:28.458909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.792 [2024-07-12 21:30:28.458935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.792 #62 NEW cov: 11842 ft: 15037 corp: 36/1820b lim: 120 exec/s: 62 rss: 70Mb L: 37/94 MS: 1 ChangeByte- 00:07:49.792 [2024-07-12 21:30:28.499047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.792 [2024-07-12 21:30:28.499073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.792 #63 NEW cov: 11842 ft: 15040 corp: 37/1860b lim: 120 exec/s: 63 rss: 70Mb L: 40/94 MS: 1 InsertRepeatedBytes- 00:07:49.792 [2024-07-12 21:30:28.539098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655398 len:769 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.792 [2024-07-12 21:30:28.539126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:49.792 #64 NEW cov: 11842 ft: 15057 corp: 38/1898b lim: 120 exec/s: 64 rss: 70Mb L: 38/94 MS: 1 ChangeBinInt- 00:07:50.051 [2024-07-12 21:30:28.579199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15046755950319947984 len:53457 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.051 [2024-07-12 21:30:28.579226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.051 #65 NEW cov: 11842 ft: 15067 corp: 39/1937b lim: 120 exec/s: 65 rss: 70Mb L: 39/94 MS: 1 ChangeBinInt- 00:07:50.051 [2024-07-12 21:30:28.619765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.051 [2024-07-12 21:30:28.619797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.051 [2024-07-12 21:30:28.619906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:190958540947456 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.051 [2024-07-12 21:30:28.619943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.051 [2024-07-12 21:30:28.620063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:12514849900987264429 len:44462 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.051 [2024-07-12 21:30:28.620086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.051 #66 NEW cov: 11842 ft: 15077 corp: 40/2032b lim: 120 exec/s: 66 rss: 70Mb L: 95/95 MS: 1 InsertByte- 00:07:50.051 [2024-07-12 21:30:28.659393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:655360 len:209 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.051 [2024-07-12 21:30:28.659419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.051 #67 NEW cov: 11842 ft: 15088 corp: 41/2070b lim: 120 exec/s: 33 rss: 70Mb L: 38/95 MS: 1 ChangeBit- 00:07:50.051 #67 DONE cov: 11842 ft: 15088 corp: 41/2070b lim: 120 exec/s: 33 rss: 70Mb 00:07:50.051 ###### Recommended dictionary. ###### 00:07:50.051 "\377'\375X\304\317\"\306" # Uses: 1 00:07:50.051 ###### End of recommended dictionary. ###### 00:07:50.051 Done 67 runs in 2 second(s) 00:07:50.051 21:30:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:07:50.051 21:30:28 -- ../common.sh@72 -- # (( i++ )) 00:07:50.051 21:30:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:50.051 21:30:28 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:07:50.051 21:30:28 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:07:50.051 21:30:28 -- nvmf/run.sh@24 -- # local timen=1 00:07:50.051 21:30:28 -- nvmf/run.sh@25 -- # local core=0x1 00:07:50.051 21:30:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:50.051 21:30:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:07:50.051 21:30:28 -- nvmf/run.sh@29 -- # printf %02d 18 00:07:50.051 21:30:28 -- nvmf/run.sh@29 -- # port=4418 00:07:50.051 21:30:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:50.051 21:30:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:07:50.051 21:30:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:50.051 21:30:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:07:50.309 [2024-07-12 21:30:28.848770] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:50.309 [2024-07-12 21:30:28.848846] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3585747 ] 00:07:50.309 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.309 [2024-07-12 21:30:29.026064] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.309 [2024-07-12 21:30:29.088500] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:50.309 [2024-07-12 21:30:29.088629] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.568 [2024-07-12 21:30:29.146585] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.568 [2024-07-12 21:30:29.162859] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:07:50.568 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.568 INFO: Seed: 258627771 00:07:50.568 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:50.568 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:50.568 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:50.568 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.568 #2 INITED exec/s: 0 rss: 60Mb 00:07:50.568 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:50.568 This may also happen if the target rejected all inputs we tried so far 00:07:50.568 [2024-07-12 21:30:29.228312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:50.568 [2024-07-12 21:30:29.228340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.568 [2024-07-12 21:30:29.228375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:50.568 [2024-07-12 21:30:29.228389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.568 [2024-07-12 21:30:29.228439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:50.568 [2024-07-12 21:30:29.228458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.568 [2024-07-12 21:30:29.228510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:50.568 [2024-07-12 21:30:29.228524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:50.826 NEW_FUNC[1/670]: 0x49e290 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:07:50.826 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.826 #3 NEW cov: 11559 ft: 11553 corp: 2/93b lim: 100 exec/s: 0 rss: 67Mb L: 92/92 MS: 1 InsertRepeatedBytes- 00:07:50.826 [2024-07-12 21:30:29.559156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:50.826 [2024-07-12 21:30:29.559213] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.826 [2024-07-12 21:30:29.559289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:50.826 [2024-07-12 21:30:29.559321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.826 [2024-07-12 21:30:29.559400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:50.826 [2024-07-12 21:30:29.559427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:50.826 #4 NEW cov: 11672 ft: 12507 corp: 3/160b lim: 100 exec/s: 0 rss: 67Mb L: 67/92 MS: 1 InsertRepeatedBytes- 00:07:50.826 [2024-07-12 21:30:29.599105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:50.826 [2024-07-12 21:30:29.599134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:50.826 [2024-07-12 21:30:29.599183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:50.826 [2024-07-12 21:30:29.599196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:50.826 [2024-07-12 21:30:29.599252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:50.826 [2024-07-12 21:30:29.599267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.085 #5 NEW cov: 11678 ft: 12802 corp: 4/231b lim: 100 exec/s: 0 rss: 67Mb L: 71/92 MS: 1 InsertRepeatedBytes- 00:07:51.085 [2024-07-12 21:30:29.639254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.085 [2024-07-12 21:30:29.639282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.639324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.085 [2024-07-12 21:30:29.639339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.639391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.085 [2024-07-12 21:30:29.639405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.085 #6 NEW cov: 11763 ft: 13094 corp: 5/298b lim: 100 exec/s: 0 rss: 67Mb L: 67/92 MS: 1 ChangeBinInt- 00:07:51.085 [2024-07-12 21:30:29.679333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.085 [2024-07-12 21:30:29.679358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.679396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.085 [2024-07-12 21:30:29.679411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.679486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.085 [2024-07-12 21:30:29.679500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.085 #7 NEW cov: 11763 ft: 13212 corp: 6/371b lim: 100 exec/s: 0 rss: 67Mb L: 73/92 MS: 1 CopyPart- 00:07:51.085 [2024-07-12 21:30:29.719691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.085 [2024-07-12 21:30:29.719717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.719768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.085 [2024-07-12 21:30:29.719782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.719836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.085 [2024-07-12 21:30:29.719852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.719907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.085 [2024-07-12 21:30:29.719922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.719976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:07:51.085 [2024-07-12 21:30:29.719991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:51.085 #8 NEW cov: 11763 ft: 13314 corp: 7/471b lim: 100 exec/s: 0 rss: 67Mb L: 100/100 MS: 1 CrossOver- 00:07:51.085 [2024-07-12 21:30:29.759696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.085 [2024-07-12 21:30:29.759723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.759768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.085 [2024-07-12 21:30:29.759782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.759835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.085 [2024-07-12 21:30:29.759849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.759903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.085 [2024-07-12 21:30:29.759917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.085 #9 NEW cov: 11763 ft: 13360 corp: 8/562b lim: 100 exec/s: 0 rss: 68Mb L: 91/100 MS: 1 CrossOver- 00:07:51.085 [2024-07-12 
21:30:29.799954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.085 [2024-07-12 21:30:29.799981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.800030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.085 [2024-07-12 21:30:29.800045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.800098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.085 [2024-07-12 21:30:29.800112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.800166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.085 [2024-07-12 21:30:29.800181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.800239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:07:51.085 [2024-07-12 21:30:29.800253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:51.085 #10 NEW cov: 11763 ft: 13393 corp: 9/662b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 ChangeByte- 00:07:51.085 [2024-07-12 21:30:29.839953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.085 [2024-07-12 21:30:29.839979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.085 [2024-07-12 21:30:29.840036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.086 [2024-07-12 21:30:29.840054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.086 [2024-07-12 21:30:29.840106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.086 [2024-07-12 21:30:29.840122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.086 [2024-07-12 21:30:29.840174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.086 [2024-07-12 21:30:29.840186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.086 #11 NEW cov: 11763 ft: 13422 corp: 10/753b lim: 100 exec/s: 0 rss: 68Mb L: 91/100 MS: 1 CrossOver- 00:07:51.344 [2024-07-12 21:30:29.879933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.344 [2024-07-12 21:30:29.879959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:29.880012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.344 [2024-07-12 21:30:29.880027] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:29.880084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.344 [2024-07-12 21:30:29.880099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.344 #12 NEW cov: 11763 ft: 13519 corp: 11/821b lim: 100 exec/s: 0 rss: 68Mb L: 68/100 MS: 1 InsertByte- 00:07:51.344 [2024-07-12 21:30:29.920189] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.344 [2024-07-12 21:30:29.920215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:29.920279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.344 [2024-07-12 21:30:29.920293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:29.920344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.344 [2024-07-12 21:30:29.920359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:29.920412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.344 [2024-07-12 21:30:29.920427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.344 #13 NEW cov: 11763 ft: 13578 corp: 12/915b lim: 100 exec/s: 0 rss: 68Mb L: 94/100 MS: 1 CopyPart- 00:07:51.344 [2024-07-12 21:30:29.960104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.344 [2024-07-12 21:30:29.960130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:29.960180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.344 [2024-07-12 21:30:29.960196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.344 #14 NEW cov: 11763 ft: 13949 corp: 13/959b lim: 100 exec/s: 0 rss: 68Mb L: 44/100 MS: 1 EraseBytes- 00:07:51.344 [2024-07-12 21:30:30.000397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.344 [2024-07-12 21:30:30.000422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:30.000491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.344 [2024-07-12 21:30:30.000510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:30.000563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.344 [2024-07-12 21:30:30.000577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:30.000631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.344 [2024-07-12 21:30:30.000646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.344 #20 NEW cov: 11763 ft: 13993 corp: 14/1051b lim: 100 exec/s: 0 rss: 68Mb L: 92/100 MS: 1 InsertByte- 00:07:51.344 [2024-07-12 21:30:30.040398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.344 [2024-07-12 21:30:30.040425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:30.040465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.344 [2024-07-12 21:30:30.040480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:30.040533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.344 [2024-07-12 21:30:30.040548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.344 #21 NEW cov: 11763 ft: 14107 corp: 15/1124b lim: 100 exec/s: 0 rss: 68Mb L: 73/100 MS: 1 ChangeBinInt- 00:07:51.344 [2024-07-12 21:30:30.080420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.344 [2024-07-12 21:30:30.080447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.344 [2024-07-12 21:30:30.080499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.344 [2024-07-12 21:30:30.080513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.344 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:51.344 #22 NEW cov: 11786 ft: 14193 corp: 16/1171b lim: 100 exec/s: 0 rss: 69Mb L: 47/100 MS: 1 EraseBytes- 00:07:51.603 [2024-07-12 21:30:30.130836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.603 [2024-07-12 21:30:30.130864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.603 [2024-07-12 21:30:30.130901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.603 [2024-07-12 21:30:30.130914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.603 [2024-07-12 21:30:30.130958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.603 [2024-07-12 21:30:30.130972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.603 [2024-07-12 21:30:30.131043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.603 [2024-07-12 21:30:30.131057] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.603 #23 NEW cov: 11786 ft: 14233 corp: 17/1269b lim: 100 exec/s: 0 rss: 69Mb L: 98/100 MS: 1 InsertRepeatedBytes- 00:07:51.603 [2024-07-12 21:30:30.170933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.603 [2024-07-12 21:30:30.170959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.603 [2024-07-12 21:30:30.171000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.603 [2024-07-12 21:30:30.171014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.603 [2024-07-12 21:30:30.171069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.603 [2024-07-12 21:30:30.171084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.603 [2024-07-12 21:30:30.171124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.603 [2024-07-12 21:30:30.171138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.603 #24 NEW cov: 11786 ft: 14270 corp: 18/1367b lim: 100 exec/s: 0 rss: 69Mb L: 98/100 MS: 1 InsertRepeatedBytes- 00:07:51.603 [2024-07-12 21:30:30.211032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.603 [2024-07-12 21:30:30.211058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.603 [2024-07-12 21:30:30.211103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.603 [2024-07-12 21:30:30.211119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.603 [2024-07-12 21:30:30.211174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.603 [2024-07-12 21:30:30.211188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.603 [2024-07-12 21:30:30.211242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.603 [2024-07-12 21:30:30.211255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.603 #25 NEW cov: 11786 ft: 14278 corp: 19/1458b lim: 100 exec/s: 25 rss: 69Mb L: 91/100 MS: 1 CMP- DE: "\000\000"- 00:07:51.604 [2024-07-12 21:30:30.251150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.604 [2024-07-12 21:30:30.251177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.604 [2024-07-12 21:30:30.251220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.604 [2024-07-12 21:30:30.251235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.604 [2024-07-12 21:30:30.251288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.604 [2024-07-12 21:30:30.251302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.604 [2024-07-12 21:30:30.251356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.604 [2024-07-12 21:30:30.251370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.604 #26 NEW cov: 11786 ft: 14360 corp: 20/1549b lim: 100 exec/s: 26 rss: 69Mb L: 91/100 MS: 1 ChangeByte- 00:07:51.604 [2024-07-12 21:30:30.291374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.604 [2024-07-12 21:30:30.291400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.604 [2024-07-12 21:30:30.291458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.604 [2024-07-12 21:30:30.291472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.604 [2024-07-12 21:30:30.291528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.604 [2024-07-12 21:30:30.291541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.604 [2024-07-12 21:30:30.291594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.604 [2024-07-12 21:30:30.291609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.604 [2024-07-12 21:30:30.291665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:07:51.604 [2024-07-12 21:30:30.291680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:51.604 #27 NEW cov: 11786 ft: 14385 corp: 21/1649b lim: 100 exec/s: 27 rss: 69Mb L: 100/100 MS: 1 ShuffleBytes- 00:07:51.604 [2024-07-12 21:30:30.331226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.604 [2024-07-12 21:30:30.331252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.604 [2024-07-12 21:30:30.331294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.604 [2024-07-12 21:30:30.331308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.604 [2024-07-12 21:30:30.331362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.604 [2024-07-12 21:30:30.331375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.604 #28 NEW cov: 11786 ft: 14403 corp: 22/1720b lim: 100 exec/s: 28 rss: 69Mb L: 71/100 MS: 1 
PersAutoDict- DE: "\000\000"- 00:07:51.604 [2024-07-12 21:30:30.371383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.604 [2024-07-12 21:30:30.371410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.604 [2024-07-12 21:30:30.371447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.604 [2024-07-12 21:30:30.371461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.604 [2024-07-12 21:30:30.371515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.604 [2024-07-12 21:30:30.371530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.863 #33 NEW cov: 11786 ft: 14410 corp: 23/1797b lim: 100 exec/s: 33 rss: 69Mb L: 77/100 MS: 5 CopyPart-InsertByte-InsertByte-ChangeBinInt-CrossOver- 00:07:51.863 [2024-07-12 21:30:30.411530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.863 [2024-07-12 21:30:30.411555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.411608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.863 [2024-07-12 21:30:30.411622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.411675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.863 [2024-07-12 21:30:30.411689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.863 #34 NEW cov: 11786 ft: 14442 corp: 24/1857b lim: 100 exec/s: 34 rss: 70Mb L: 60/100 MS: 1 EraseBytes- 00:07:51.863 [2024-07-12 21:30:30.451576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.863 [2024-07-12 21:30:30.451606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.451641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.863 [2024-07-12 21:30:30.451656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.451710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.863 [2024-07-12 21:30:30.451725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.863 #35 NEW cov: 11786 ft: 14456 corp: 25/1934b lim: 100 exec/s: 35 rss: 70Mb L: 77/100 MS: 1 ShuffleBytes- 00:07:51.863 [2024-07-12 21:30:30.491666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.863 [2024-07-12 21:30:30.491692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:07:51.863 [2024-07-12 21:30:30.491726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.863 [2024-07-12 21:30:30.491741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.491795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.863 [2024-07-12 21:30:30.491809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.863 #36 NEW cov: 11786 ft: 14538 corp: 26/2009b lim: 100 exec/s: 36 rss: 70Mb L: 75/100 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:51.863 [2024-07-12 21:30:30.531839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.863 [2024-07-12 21:30:30.531865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.531900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.863 [2024-07-12 21:30:30.531915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.531971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.863 [2024-07-12 21:30:30.531986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.863 #37 NEW cov: 11786 ft: 14569 corp: 27/2076b lim: 100 exec/s: 37 rss: 70Mb L: 67/100 MS: 1 ChangeBit- 00:07:51.863 [2024-07-12 21:30:30.571899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.863 [2024-07-12 21:30:30.571925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.571960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.863 [2024-07-12 21:30:30.571975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.572029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.863 [2024-07-12 21:30:30.572044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.863 #38 NEW cov: 11786 ft: 14583 corp: 28/2147b lim: 100 exec/s: 38 rss: 70Mb L: 71/100 MS: 1 CopyPart- 00:07:51.863 [2024-07-12 21:30:30.612161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:51.863 [2024-07-12 21:30:30.612186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.612226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:51.863 [2024-07-12 21:30:30.612241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.612295] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:51.863 [2024-07-12 21:30:30.612310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:51.863 [2024-07-12 21:30:30.612362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:51.863 [2024-07-12 21:30:30.612376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:51.863 #39 NEW cov: 11786 ft: 14613 corp: 29/2245b lim: 100 exec/s: 39 rss: 70Mb L: 98/100 MS: 1 ChangeByte- 00:07:52.121 [2024-07-12 21:30:30.651892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.121 [2024-07-12 21:30:30.651918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.121 #40 NEW cov: 11786 ft: 14960 corp: 30/2266b lim: 100 exec/s: 40 rss: 70Mb L: 21/100 MS: 1 CrossOver- 00:07:52.121 [2024-07-12 21:30:30.692435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.121 [2024-07-12 21:30:30.692466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.692504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.121 [2024-07-12 21:30:30.692519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.692573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.121 [2024-07-12 21:30:30.692605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.692662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:52.121 [2024-07-12 21:30:30.692677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.121 #41 NEW cov: 11786 ft: 14993 corp: 31/2364b lim: 100 exec/s: 41 rss: 70Mb L: 98/100 MS: 1 ShuffleBytes- 00:07:52.121 [2024-07-12 21:30:30.732504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.121 [2024-07-12 21:30:30.732530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.732580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.121 [2024-07-12 21:30:30.732595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.732649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.121 [2024-07-12 21:30:30.732662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.732717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE 
ZEROES (08) sqid:1 cid:3 nsid:0 00:07:52.121 [2024-07-12 21:30:30.732731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.121 #42 NEW cov: 11786 ft: 14999 corp: 32/2451b lim: 100 exec/s: 42 rss: 70Mb L: 87/100 MS: 1 InsertRepeatedBytes- 00:07:52.121 [2024-07-12 21:30:30.772508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.121 [2024-07-12 21:30:30.772534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.772582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.121 [2024-07-12 21:30:30.772598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.772651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.121 [2024-07-12 21:30:30.772665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.121 #43 NEW cov: 11786 ft: 15028 corp: 33/2526b lim: 100 exec/s: 43 rss: 70Mb L: 75/100 MS: 1 ChangeBit- 00:07:52.121 [2024-07-12 21:30:30.812758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.121 [2024-07-12 21:30:30.812784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.812834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.121 [2024-07-12 21:30:30.812848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.812902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.121 [2024-07-12 21:30:30.812916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.812967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:52.121 [2024-07-12 21:30:30.812981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.121 #44 NEW cov: 11786 ft: 15037 corp: 34/2617b lim: 100 exec/s: 44 rss: 70Mb L: 91/100 MS: 1 CopyPart- 00:07:52.121 [2024-07-12 21:30:30.852885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.121 [2024-07-12 21:30:30.852911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.852950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.121 [2024-07-12 21:30:30.852965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.121 [2024-07-12 21:30:30.853019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.121 [2024-07-12 
21:30:30.853033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.122 [2024-07-12 21:30:30.853085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:52.122 [2024-07-12 21:30:30.853098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.122 #45 NEW cov: 11786 ft: 15050 corp: 35/2708b lim: 100 exec/s: 45 rss: 70Mb L: 91/100 MS: 1 ChangeByte- 00:07:52.122 [2024-07-12 21:30:30.892780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.122 [2024-07-12 21:30:30.892807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.122 [2024-07-12 21:30:30.892857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.122 [2024-07-12 21:30:30.892872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.380 #46 NEW cov: 11786 ft: 15091 corp: 36/2761b lim: 100 exec/s: 46 rss: 70Mb L: 53/100 MS: 1 EraseBytes- 00:07:52.380 [2024-07-12 21:30:30.933139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.380 [2024-07-12 21:30:30.933168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:30.933204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.380 [2024-07-12 21:30:30.933218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:30.933270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.380 [2024-07-12 21:30:30.933284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:30.933337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:52.380 [2024-07-12 21:30:30.933352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.380 #47 NEW cov: 11786 ft: 15094 corp: 37/2856b lim: 100 exec/s: 47 rss: 70Mb L: 95/100 MS: 1 InsertRepeatedBytes- 00:07:52.380 [2024-07-12 21:30:30.973270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.380 [2024-07-12 21:30:30.973295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:30.973345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.380 [2024-07-12 21:30:30.973360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:30.973432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.380 [2024-07-12 21:30:30.973450] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:30.973502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:52.380 [2024-07-12 21:30:30.973515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.380 #48 NEW cov: 11786 ft: 15104 corp: 38/2947b lim: 100 exec/s: 48 rss: 70Mb L: 91/100 MS: 1 CMP- DE: "\016\000"- 00:07:52.380 [2024-07-12 21:30:31.013374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.380 [2024-07-12 21:30:31.013400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:31.013481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.380 [2024-07-12 21:30:31.013498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:31.013552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.380 [2024-07-12 21:30:31.013565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:31.013618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:52.380 [2024-07-12 21:30:31.013633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.380 #49 NEW cov: 11786 ft: 15140 corp: 39/3030b lim: 100 exec/s: 49 rss: 70Mb L: 83/100 MS: 1 InsertRepeatedBytes- 00:07:52.380 [2024-07-12 21:30:31.053334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.380 [2024-07-12 21:30:31.053361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:31.053397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.380 [2024-07-12 21:30:31.053415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:31.053475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.380 [2024-07-12 21:30:31.053490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.380 #50 NEW cov: 11786 ft: 15158 corp: 40/3097b lim: 100 exec/s: 50 rss: 70Mb L: 67/100 MS: 1 ChangeBinInt- 00:07:52.380 [2024-07-12 21:30:31.083471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.380 [2024-07-12 21:30:31.083498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:31.083551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.380 [2024-07-12 21:30:31.083566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:31.083621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.380 [2024-07-12 21:30:31.083636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.380 #51 NEW cov: 11786 ft: 15199 corp: 41/3164b lim: 100 exec/s: 51 rss: 70Mb L: 67/100 MS: 1 ChangeBit- 00:07:52.380 [2024-07-12 21:30:31.123680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.380 [2024-07-12 21:30:31.123705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.380 [2024-07-12 21:30:31.123756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.381 [2024-07-12 21:30:31.123770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.381 [2024-07-12 21:30:31.123824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.381 [2024-07-12 21:30:31.123839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:52.381 [2024-07-12 21:30:31.123893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:52.381 [2024-07-12 21:30:31.123906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:52.381 #52 NEW cov: 11786 ft: 15218 corp: 42/3255b lim: 100 exec/s: 52 rss: 70Mb L: 91/100 MS: 1 ChangeBinInt- 00:07:52.639 [2024-07-12 21:30:31.163582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.639 [2024-07-12 21:30:31.163609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.639 [2024-07-12 21:30:31.163644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.640 [2024-07-12 21:30:31.163659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.640 #53 NEW cov: 11786 ft: 15227 corp: 43/3302b lim: 100 exec/s: 53 rss: 70Mb L: 47/100 MS: 1 ChangeBit- 00:07:52.640 [2024-07-12 21:30:31.204034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:52.640 [2024-07-12 21:30:31.204060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:52.640 [2024-07-12 21:30:31.204114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:52.640 [2024-07-12 21:30:31.204129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:52.640 [2024-07-12 21:30:31.204183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:52.640 [2024-07-12 21:30:31.204201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 
dnr:1
00:07:52.640 [2024-07-12 21:30:31.204257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0
00:07:52.640 [2024-07-12 21:30:31.204271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:07:52.640 [2024-07-12 21:30:31.204327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0
00:07:52.640 [2024-07-12 21:30:31.204340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:07:52.640 #54 NEW cov: 11786 ft: 15257 corp: 44/3402b lim: 100 exec/s: 27 rss: 70Mb L: 100/100 MS: 1 ChangeByte-
00:07:52.640 #54 DONE cov: 11786 ft: 15257 corp: 44/3402b lim: 100 exec/s: 27 rss: 70Mb
00:07:52.640 ###### Recommended dictionary. ######
00:07:52.640 "\000\000" # Uses: 2
00:07:52.640 "\016\000" # Uses: 0
00:07:52.640 ###### End of recommended dictionary. ######
00:07:52.640 Done 54 runs in 2 second(s)
00:07:52.640 21:30:31 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf
21:30:31 -- ../common.sh@72 -- # (( i++ ))
00:07:52.640 21:30:31 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:52.640 21:30:31 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1
00:07:52.640 21:30:31 -- nvmf/run.sh@23 -- # local fuzzer_type=19
00:07:52.640 21:30:31 -- nvmf/run.sh@24 -- # local timen=1
00:07:52.640 21:30:31 -- nvmf/run.sh@25 -- # local core=0x1
00:07:52.640 21:30:31 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:07:52.640 21:30:31 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf
00:07:52.640 21:30:31 -- nvmf/run.sh@29 -- # printf %02d 19
00:07:52.640 21:30:31 -- nvmf/run.sh@29 -- # port=4419
00:07:52.640 21:30:31 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:07:52.640 21:30:31 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419'
00:07:52.640 21:30:31 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:52.640 21:30:31 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock
00:07:52.640 [2024-07-12 21:30:31.394790] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:07:52.640 [2024-07-12 21:30:31.394860] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3586177 ]
00:07:52.898 EAL: No free 2048 kB hugepages reported on node 1
00:07:52.898 [2024-07-12 21:30:31.577117] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:52.898 [2024-07-12 21:30:31.642540] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:52.898 [2024-07-12 21:30:31.642664] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:53.157 [2024-07-12 21:30:31.700490] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:53.157 [2024-07-12 21:30:31.716769] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 ***
00:07:53.157 INFO: Running with entropic power schedule (0xFF, 100).
00:07:53.157 INFO: Seed: 2812625114
00:07:53.157 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:07:53.157 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:07:53.157 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:07:53.157 INFO: A corpus is not provided, starting from an empty corpus
00:07:53.157 #2 INITED exec/s: 0 rss: 60Mb
00:07:53.157 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:53.157 This may also happen if the target rejected all inputs we tried so far
00:07:53.157 [2024-07-12 21:30:31.792871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571
00:07:53.157 [2024-07-12 21:30:31.792912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:53.157 [2024-07-12 21:30:31.793046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571
00:07:53.157 [2024-07-12 21:30:31.793070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:53.415 NEW_FUNC[1/670]: 0x4a1250 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582
00:07:53.415 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:53.415 #4 NEW cov: 11537 ft: 11538 corp: 2/27b lim: 50 exec/s: 0 rss: 67Mb L: 26/26 MS: 2 ShuffleBytes-InsertRepeatedBytes-
00:07:53.415 [2024-07-12 21:30:32.123846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536
00:07:53.415 [2024-07-12 21:30:32.123884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:53.415 [2024-07-12 21:30:32.124023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536
00:07:53.415 [2024-07-12 21:30:32.124045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:53.415 [2024-07-12 21:30:32.124173]
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:15115 00:07:53.415 [2024-07-12 21:30:32.124194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.415 #8 NEW cov: 11650 ft: 12425 corp: 3/57b lim: 50 exec/s: 0 rss: 67Mb L: 30/30 MS: 4 CopyPart-CrossOver-ChangeByte-InsertRepeatedBytes- 00:07:53.415 [2024-07-12 21:30:32.173907] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4294967295 len:256 00:07:53.415 [2024-07-12 21:30:32.173941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.415 [2024-07-12 21:30:32.174056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:53.415 [2024-07-12 21:30:32.174076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.415 [2024-07-12 21:30:32.174194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:53.415 [2024-07-12 21:30:32.174219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.674 #9 NEW cov: 11656 ft: 12599 corp: 4/92b lim: 50 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:53.674 [2024-07-12 21:30:32.223887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:53.674 [2024-07-12 21:30:32.223921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.674 [2024-07-12 21:30:32.224038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9946773765235542538 len:2571 00:07:53.675 [2024-07-12 21:30:32.224060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.675 #10 NEW cov: 11741 ft: 12878 corp: 5/118b lim: 50 exec/s: 0 rss: 67Mb L: 26/35 MS: 1 ChangeBit- 00:07:53.675 [2024-07-12 21:30:32.274249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709488895 len:65536 00:07:53.675 [2024-07-12 21:30:32.274279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.675 [2024-07-12 21:30:32.274381] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:53.675 [2024-07-12 21:30:32.274409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.675 [2024-07-12 21:30:32.274538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:15115 00:07:53.675 [2024-07-12 21:30:32.274560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.675 #11 NEW cov: 11741 ft: 12987 corp: 6/148b lim: 50 exec/s: 0 rss: 
67Mb L: 30/35 MS: 1 CrossOver- 00:07:53.675 [2024-07-12 21:30:32.324221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:53.675 [2024-07-12 21:30:32.324252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.675 [2024-07-12 21:30:32.324376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9968165863465552394 len:2571 00:07:53.675 [2024-07-12 21:30:32.324398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.675 #12 NEW cov: 11741 ft: 13063 corp: 7/174b lim: 50 exec/s: 0 rss: 67Mb L: 26/35 MS: 1 ChangeByte- 00:07:53.675 [2024-07-12 21:30:32.384476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:53.675 [2024-07-12 21:30:32.384515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.675 [2024-07-12 21:30:32.384645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9946773765235542538 len:2571 00:07:53.675 [2024-07-12 21:30:32.384667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.675 #13 NEW cov: 11741 ft: 13103 corp: 8/200b lim: 50 exec/s: 0 rss: 67Mb L: 26/35 MS: 1 ChangeByte- 00:07:53.675 [2024-07-12 21:30:32.434574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:53.675 [2024-07-12 21:30:32.434606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.675 [2024-07-12 21:30:32.434726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9946773765236328970 len:2571 00:07:53.675 [2024-07-12 21:30:32.434748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.934 #14 NEW cov: 11741 ft: 13153 corp: 9/226b lim: 50 exec/s: 0 rss: 67Mb L: 26/35 MS: 1 ChangeByte- 00:07:53.934 [2024-07-12 21:30:32.484780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:53.934 [2024-07-12 21:30:32.484806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.934 [2024-07-12 21:30:32.484939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9946773765235542538 len:2571 00:07:53.934 [2024-07-12 21:30:32.484965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.934 #15 NEW cov: 11741 ft: 13187 corp: 10/252b lim: 50 exec/s: 0 rss: 67Mb L: 26/35 MS: 1 ShuffleBytes- 00:07:53.934 [2024-07-12 21:30:32.534908] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:53.934 [2024-07-12 21:30:32.534938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.934 [2024-07-12 21:30:32.535058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:759430526221814282 len:2571 00:07:53.934 [2024-07-12 21:30:32.535084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.934 #16 NEW cov: 11741 ft: 13228 corp: 11/279b lim: 50 exec/s: 0 rss: 67Mb L: 27/35 MS: 1 InsertByte- 00:07:53.934 [2024-07-12 21:30:32.585195] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:53.934 [2024-07-12 21:30:32.585228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.934 [2024-07-12 21:30:32.585320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9946773765235542538 len:2571 00:07:53.934 [2024-07-12 21:30:32.585340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.934 [2024-07-12 21:30:32.585466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:723401728380766730 len:2571 00:07:53.934 [2024-07-12 21:30:32.585488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.934 #17 NEW cov: 11741 ft: 13251 corp: 12/310b lim: 50 exec/s: 0 rss: 68Mb L: 31/35 MS: 1 CopyPart- 00:07:53.934 [2024-07-12 21:30:32.635619] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:07:53.934 [2024-07-12 21:30:32.635654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.934 [2024-07-12 21:30:32.635750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:53.934 [2024-07-12 21:30:32.635776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.934 [2024-07-12 21:30:32.635902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:53.934 [2024-07-12 21:30:32.635925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:53.934 [2024-07-12 21:30:32.636048] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:07:53.934 [2024-07-12 21:30:32.636069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:53.934 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:53.934 #18 NEW cov: 11764 ft: 13538 corp: 13/354b lim: 50 exec/s: 0 rss: 68Mb L: 44/44 MS: 1 CopyPart- 00:07:53.934 [2024-07-12 21:30:32.685315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:53.934 [2024-07-12 21:30:32.685351] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:53.934 [2024-07-12 21:30:32.685477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9946773765235542538 len:2571 00:07:53.934 [2024-07-12 21:30:32.685499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:53.934 #19 NEW cov: 11764 ft: 13614 corp: 14/375b lim: 50 exec/s: 0 rss: 68Mb L: 21/44 MS: 1 EraseBytes- 00:07:54.193 [2024-07-12 21:30:32.735503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380774154 len:2571 00:07:54.193 [2024-07-12 21:30:32.735540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.193 [2024-07-12 21:30:32.735673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 00:07:54.193 [2024-07-12 21:30:32.735693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.193 #20 NEW cov: 11764 ft: 13650 corp: 15/401b lim: 50 exec/s: 0 rss: 68Mb L: 26/44 MS: 1 ChangeByte- 00:07:54.193 [2024-07-12 21:30:32.785735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:54.193 [2024-07-12 21:30:32.785766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.193 [2024-07-12 21:30:32.785886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9954373589606730250 len:2571 00:07:54.193 [2024-07-12 21:30:32.785906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.193 #21 NEW cov: 11764 ft: 13662 corp: 16/427b lim: 50 exec/s: 21 rss: 68Mb L: 26/44 MS: 1 ChangeByte- 00:07:54.193 [2024-07-12 21:30:32.846149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:54.193 [2024-07-12 21:30:32.846182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.193 [2024-07-12 21:30:32.846284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9946773765235542538 len:2571 00:07:54.193 [2024-07-12 21:30:32.846309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.193 [2024-07-12 21:30:32.846424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:723401728380766730 len:2571 00:07:54.193 [2024-07-12 21:30:32.846447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.193 #22 NEW cov: 11764 ft: 13683 corp: 17/458b lim: 50 exec/s: 22 rss: 68Mb L: 31/44 MS: 1 ShuffleBytes- 00:07:54.193 [2024-07-12 21:30:32.906075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:184549376 len:1 00:07:54.193 [2024-07-12 
21:30:32.906101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.193 #29 NEW cov: 11764 ft: 13950 corp: 18/472b lim: 50 exec/s: 29 rss: 68Mb L: 14/44 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:54.193 [2024-07-12 21:30:32.956318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:54.193 [2024-07-12 21:30:32.956350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.193 [2024-07-12 21:30:32.956448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:759430525400517216 len:2571 00:07:54.193 [2024-07-12 21:30:32.956473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.452 #30 NEW cov: 11764 ft: 13990 corp: 19/499b lim: 50 exec/s: 30 rss: 68Mb L: 27/44 MS: 1 InsertByte- 00:07:54.452 [2024-07-12 21:30:33.016552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380774154 len:2571 00:07:54.452 [2024-07-12 21:30:33.016589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.452 [2024-07-12 21:30:33.016710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:723401730427587082 len:2571 00:07:54.452 [2024-07-12 21:30:33.016740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.452 #31 NEW cov: 11764 ft: 13997 corp: 20/526b lim: 50 exec/s: 31 rss: 68Mb L: 27/44 MS: 1 InsertByte- 00:07:54.452 [2024-07-12 21:30:33.076633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:54.452 [2024-07-12 21:30:33.076667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.452 [2024-07-12 21:30:33.076791] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:759430526221816842 len:2571 00:07:54.452 [2024-07-12 21:30:33.076813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.452 #32 NEW cov: 11764 ft: 14020 corp: 21/553b lim: 50 exec/s: 32 rss: 68Mb L: 27/44 MS: 1 ChangeBinInt- 00:07:54.452 [2024-07-12 21:30:33.127077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:54.452 [2024-07-12 21:30:33.127108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.452 [2024-07-12 21:30:33.127217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9946773765235542538 len:2611 00:07:54.452 [2024-07-12 21:30:33.127238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.452 [2024-07-12 21:30:33.127364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 
lba:734660727449192970 len:2571 00:07:54.453 [2024-07-12 21:30:33.127386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.453 #33 NEW cov: 11764 ft: 14201 corp: 22/583b lim: 50 exec/s: 33 rss: 68Mb L: 30/44 MS: 1 CopyPart- 00:07:54.453 [2024-07-12 21:30:33.176896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:54.453 [2024-07-12 21:30:33.176926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.453 [2024-07-12 21:30:33.177064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:759430525400124000 len:2571 00:07:54.453 [2024-07-12 21:30:33.177090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.453 #34 NEW cov: 11764 ft: 14209 corp: 23/610b lim: 50 exec/s: 34 rss: 68Mb L: 27/44 MS: 1 ChangeBinInt- 00:07:54.453 [2024-07-12 21:30:33.227111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:65291 00:07:54.453 [2024-07-12 21:30:33.227143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.453 [2024-07-12 21:30:33.227271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:759430525399730698 len:2571 00:07:54.453 [2024-07-12 21:30:33.227292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.712 #35 NEW cov: 11764 ft: 14216 corp: 24/637b lim: 50 exec/s: 35 rss: 69Mb L: 27/44 MS: 1 InsertByte- 00:07:54.712 [2024-07-12 21:30:33.277103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723542465869122058 len:2571 00:07:54.712 [2024-07-12 21:30:33.277135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.712 #36 NEW cov: 11764 ft: 14277 corp: 25/648b lim: 50 exec/s: 36 rss: 69Mb L: 11/44 MS: 1 CrossOver- 00:07:54.712 [2024-07-12 21:30:33.327719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4294967295 len:256 00:07:54.712 [2024-07-12 21:30:33.327752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.712 [2024-07-12 21:30:33.327860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551606 len:65536 00:07:54.712 [2024-07-12 21:30:33.327882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.712 [2024-07-12 21:30:33.327996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:54.712 [2024-07-12 21:30:33.328018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.712 #37 NEW cov: 11764 ft: 14289 corp: 26/683b lim: 50 exec/s: 37 rss: 69Mb L: 35/44 MS: 1 ChangeBinInt- 
00:07:54.712 [2024-07-12 21:30:33.377599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:3595 00:07:54.712 [2024-07-12 21:30:33.377631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.712 [2024-07-12 21:30:33.377768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9968165863465552394 len:2571 00:07:54.712 [2024-07-12 21:30:33.377788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.712 #38 NEW cov: 11764 ft: 14315 corp: 27/709b lim: 50 exec/s: 38 rss: 69Mb L: 26/44 MS: 1 ChangeBit- 00:07:54.712 [2024-07-12 21:30:33.427857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:759430525399730698 len:2571 00:07:54.712 [2024-07-12 21:30:33.427888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.712 [2024-07-12 21:30:33.428017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 00:07:54.712 [2024-07-12 21:30:33.428038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.712 #39 NEW cov: 11764 ft: 14350 corp: 28/731b lim: 50 exec/s: 39 rss: 69Mb L: 22/44 MS: 1 EraseBytes- 00:07:54.712 [2024-07-12 21:30:33.478175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4294967295 len:256 00:07:54.712 [2024-07-12 21:30:33.478206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.712 [2024-07-12 21:30:33.478321] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:54.712 [2024-07-12 21:30:33.478344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.712 [2024-07-12 21:30:33.478467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446479091407257599 len:65536 00:07:54.712 [2024-07-12 21:30:33.478503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.972 #40 NEW cov: 11764 ft: 14370 corp: 29/766b lim: 50 exec/s: 40 rss: 69Mb L: 35/44 MS: 1 ChangeByte- 00:07:54.972 [2024-07-12 21:30:33.528596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:54.972 [2024-07-12 21:30:33.528629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.528736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:723408325450533386 len:24587 00:07:54.972 [2024-07-12 21:30:33.528759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.528889] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:723401902326942218 len:2571 00:07:54.972 [2024-07-12 21:30:33.528909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.529029] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:6920495553130926090 len:2571 00:07:54.972 [2024-07-12 21:30:33.529051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.972 #41 NEW cov: 11764 ft: 14396 corp: 30/814b lim: 50 exec/s: 41 rss: 69Mb L: 48/48 MS: 1 CopyPart- 00:07:54.972 [2024-07-12 21:30:33.578710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:54.972 [2024-07-12 21:30:33.578743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.578844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9946841097437841930 len:18248 00:07:54.972 [2024-07-12 21:30:33.578866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.578990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:5136152271503443783 len:18248 00:07:54.972 [2024-07-12 21:30:33.579012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.579134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:723401729408190279 len:2571 00:07:54.972 [2024-07-12 21:30:33.579155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.972 #42 NEW cov: 11764 ft: 14404 corp: 31/863b lim: 50 exec/s: 42 rss: 69Mb L: 49/49 MS: 1 InsertRepeatedBytes- 00:07:54.972 [2024-07-12 21:30:33.628413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:54.972 [2024-07-12 21:30:33.628451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.628586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:723401728380766730 len:2571 00:07:54.972 [2024-07-12 21:30:33.628613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.972 #43 NEW cov: 11764 ft: 14413 corp: 32/883b lim: 50 exec/s: 43 rss: 69Mb L: 20/49 MS: 1 EraseBytes- 00:07:54.972 [2024-07-12 21:30:33.679043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:54.972 [2024-07-12 21:30:33.679077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.679146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4702111233552236865 
len:16706 00:07:54.972 [2024-07-12 21:30:33.679166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.679285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:723402279062946113 len:2571 00:07:54.972 [2024-07-12 21:30:33.679309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.679445] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:723401728380776970 len:12811 00:07:54.972 [2024-07-12 21:30:33.679467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:54.972 #44 NEW cov: 11764 ft: 14421 corp: 33/926b lim: 50 exec/s: 44 rss: 69Mb L: 43/49 MS: 1 InsertRepeatedBytes- 00:07:54.972 [2024-07-12 21:30:33.729136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:65291 00:07:54.972 [2024-07-12 21:30:33.729166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.729280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:759430525399730698 len:2816 00:07:54.972 [2024-07-12 21:30:33.729304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.729433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:723542465869122058 len:2571 00:07:54.972 [2024-07-12 21:30:33.729460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:54.972 [2024-07-12 21:30:33.729591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:723401728380766730 len:2571 00:07:54.972 [2024-07-12 21:30:33.729614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:55.232 #45 NEW cov: 11764 ft: 14427 corp: 34/972b lim: 50 exec/s: 45 rss: 69Mb L: 46/49 MS: 1 CopyPart- 00:07:55.232 [2024-07-12 21:30:33.779055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:723401728380766730 len:2571 00:07:55.232 [2024-07-12 21:30:33.779082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.232 [2024-07-12 21:30:33.779221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1443978218515872522 len:2571 00:07:55.232 [2024-07-12 21:30:33.779246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:55.232 #46 NEW cov: 11764 ft: 14449 corp: 35/1001b lim: 50 exec/s: 23 rss: 69Mb L: 29/49 MS: 1 CopyPart- 00:07:55.232 #46 DONE cov: 11764 ft: 14449 corp: 35/1001b lim: 50 exec/s: 23 rss: 69Mb 00:07:55.232 Done 46 runs in 2 second(s) 00:07:55.232 21:30:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:07:55.232 21:30:33 -- 
../common.sh@72 -- # (( i++ )) 00:07:55.232 21:30:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.232 21:30:33 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:07:55.232 21:30:33 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:07:55.232 21:30:33 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.232 21:30:33 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.232 21:30:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:55.232 21:30:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:07:55.232 21:30:33 -- nvmf/run.sh@29 -- # printf %02d 20 00:07:55.232 21:30:33 -- nvmf/run.sh@29 -- # port=4420 00:07:55.232 21:30:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:55.232 21:30:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:07:55.232 21:30:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.232 21:30:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:07:55.232 [2024-07-12 21:30:33.969988] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:55.232 [2024-07-12 21:30:33.970074] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3586577 ] 00:07:55.232 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.490 [2024-07-12 21:30:34.153130] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.490 [2024-07-12 21:30:34.224230] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:55.490 [2024-07-12 21:30:34.224356] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.749 [2024-07-12 21:30:34.282730] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.749 [2024-07-12 21:30:34.298990] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:07:55.749 INFO: Running with entropic power schedule (0xFF, 100). 00:07:55.749 INFO: Seed: 1099655074 00:07:55.749 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:55.749 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:55.749 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:55.749 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.749 #2 INITED exec/s: 0 rss: 60Mb 00:07:55.749 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:55.749 This may also happen if the target rejected all inputs we tried so far 00:07:55.749 [2024-07-12 21:30:34.375040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:55.749 [2024-07-12 21:30:34.375076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:55.749 [2024-07-12 21:30:34.375196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:55.749 [2024-07-12 21:30:34.375219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.008 NEW_FUNC[1/670]: 0x4a2e10 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:07:56.008 NEW_FUNC[2/670]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:56.008 #6 NEW cov: 11563 ft: 11564 corp: 2/40b lim: 90 exec/s: 0 rss: 66Mb L: 39/39 MS: 4 CMP-ShuffleBytes-EraseBytes-InsertRepeatedBytes- DE: "h\000\000\000\000\000\000\000"- 00:07:56.008 [2024-07-12 21:30:34.705813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.008 [2024-07-12 21:30:34.705852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.008 [2024-07-12 21:30:34.705976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:56.008 [2024-07-12 21:30:34.706000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.008 NEW_FUNC[1/2]: 0xf87430 in posix_sock_read /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1497 00:07:56.008 NEW_FUNC[2/2]: 0x1df79c0 in spdk_pipe_writer_get_buffer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/util/pipe.c:42 00:07:56.008 #12 NEW cov: 11708 ft: 12224 corp: 3/79b lim: 90 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ChangeByte- 00:07:56.008 [2024-07-12 21:30:34.755784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.008 [2024-07-12 21:30:34.755808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.008 #13 NEW cov: 11714 ft: 13232 corp: 4/108b lim: 90 exec/s: 0 rss: 67Mb L: 29/39 MS: 1 CrossOver- 00:07:56.268 [2024-07-12 21:30:34.796404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.268 [2024-07-12 21:30:34.796435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.268 [2024-07-12 21:30:34.796563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:56.268 [2024-07-12 21:30:34.796583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.268 [2024-07-12 21:30:34.796700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:56.268 [2024-07-12 21:30:34.796728] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.268 #14 NEW cov: 11799 ft: 13970 corp: 5/166b lim: 90 exec/s: 0 rss: 67Mb L: 58/58 MS: 1 CopyPart- 00:07:56.268 [2024-07-12 21:30:34.836229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.268 [2024-07-12 21:30:34.836263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.268 [2024-07-12 21:30:34.836384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:56.268 [2024-07-12 21:30:34.836404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.268 #16 NEW cov: 11799 ft: 14070 corp: 6/214b lim: 90 exec/s: 0 rss: 67Mb L: 48/58 MS: 2 PersAutoDict-CrossOver- DE: "h\000\000\000\000\000\000\000"- 00:07:56.268 [2024-07-12 21:30:34.876333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.268 [2024-07-12 21:30:34.876357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.268 [2024-07-12 21:30:34.876497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:56.268 [2024-07-12 21:30:34.876522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.268 #17 NEW cov: 11799 ft: 14124 corp: 7/253b lim: 90 exec/s: 0 rss: 67Mb L: 39/58 MS: 1 CrossOver- 00:07:56.268 [2024-07-12 21:30:34.916408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.268 [2024-07-12 21:30:34.916437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.268 [2024-07-12 21:30:34.916561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:56.268 [2024-07-12 21:30:34.916585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.268 #18 NEW cov: 11799 ft: 14193 corp: 8/299b lim: 90 exec/s: 0 rss: 67Mb L: 46/58 MS: 1 InsertRepeatedBytes- 00:07:56.268 [2024-07-12 21:30:34.956278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.268 [2024-07-12 21:30:34.956310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.268 #19 NEW cov: 11799 ft: 14286 corp: 9/329b lim: 90 exec/s: 0 rss: 67Mb L: 30/58 MS: 1 InsertByte- 00:07:56.268 [2024-07-12 21:30:34.996491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.268 [2024-07-12 21:30:34.996516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.268 #20 NEW cov: 11799 ft: 14375 corp: 10/353b lim: 90 exec/s: 0 rss: 67Mb L: 24/58 MS: 1 CrossOver- 00:07:56.268 [2024-07-12 21:30:35.036506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 
00:07:56.268 [2024-07-12 21:30:35.036538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.528 #21 NEW cov: 11799 ft: 14418 corp: 11/380b lim: 90 exec/s: 0 rss: 67Mb L: 27/58 MS: 1 EraseBytes- 00:07:56.528 [2024-07-12 21:30:35.076704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.528 [2024-07-12 21:30:35.076736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.528 #22 NEW cov: 11799 ft: 14434 corp: 12/403b lim: 90 exec/s: 0 rss: 67Mb L: 23/58 MS: 1 EraseBytes- 00:07:56.528 [2024-07-12 21:30:35.117029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.528 [2024-07-12 21:30:35.117057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.528 [2024-07-12 21:30:35.117165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:56.528 [2024-07-12 21:30:35.117190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.528 #23 NEW cov: 11799 ft: 14481 corp: 13/443b lim: 90 exec/s: 0 rss: 69Mb L: 40/58 MS: 1 InsertByte- 00:07:56.528 [2024-07-12 21:30:35.156902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.528 [2024-07-12 21:30:35.156927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.528 #24 NEW cov: 11799 ft: 14567 corp: 14/474b lim: 90 exec/s: 0 rss: 69Mb L: 31/58 MS: 1 EraseBytes- 00:07:56.528 [2024-07-12 21:30:35.197055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.528 [2024-07-12 21:30:35.197080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.528 #25 NEW cov: 11799 ft: 14623 corp: 15/507b lim: 90 exec/s: 0 rss: 69Mb L: 33/58 MS: 1 CMP- DE: "\000\000\000\222"- 00:07:56.528 [2024-07-12 21:30:35.237104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.528 [2024-07-12 21:30:35.237131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.528 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:56.528 #26 NEW cov: 11822 ft: 14655 corp: 16/538b lim: 90 exec/s: 0 rss: 69Mb L: 31/58 MS: 1 CMP- DE: "\001(\375bc\004\023J"- 00:07:56.528 [2024-07-12 21:30:35.277251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.528 [2024-07-12 21:30:35.277276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.528 #27 NEW cov: 11822 ft: 14677 corp: 17/569b lim: 90 exec/s: 0 rss: 69Mb L: 31/58 MS: 1 EraseBytes- 00:07:56.787 [2024-07-12 21:30:35.317385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.787 [2024-07-12 21:30:35.317410] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.787 #28 NEW cov: 11822 ft: 14691 corp: 18/593b lim: 90 exec/s: 28 rss: 69Mb L: 24/58 MS: 1 ShuffleBytes- 00:07:56.787 [2024-07-12 21:30:35.368243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.787 [2024-07-12 21:30:35.368275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.787 [2024-07-12 21:30:35.368391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:56.787 [2024-07-12 21:30:35.368412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.787 [2024-07-12 21:30:35.368541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:56.787 [2024-07-12 21:30:35.368564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.787 [2024-07-12 21:30:35.368678] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:56.787 [2024-07-12 21:30:35.368698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.787 #29 NEW cov: 11822 ft: 15028 corp: 19/671b lim: 90 exec/s: 29 rss: 69Mb L: 78/78 MS: 1 InsertRepeatedBytes- 00:07:56.788 [2024-07-12 21:30:35.407956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.788 [2024-07-12 21:30:35.407990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.788 [2024-07-12 21:30:35.408124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:56.788 [2024-07-12 21:30:35.408144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.788 #30 NEW cov: 11822 ft: 15043 corp: 20/711b lim: 90 exec/s: 30 rss: 69Mb L: 40/78 MS: 1 ShuffleBytes- 00:07:56.788 [2024-07-12 21:30:35.447825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.788 [2024-07-12 21:30:35.447849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.788 #31 NEW cov: 11822 ft: 15048 corp: 21/735b lim: 90 exec/s: 31 rss: 69Mb L: 24/78 MS: 1 EraseBytes- 00:07:56.788 [2024-07-12 21:30:35.488736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.788 [2024-07-12 21:30:35.488766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.788 [2024-07-12 21:30:35.488849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:56.788 [2024-07-12 21:30:35.488874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.788 [2024-07-12 21:30:35.489001] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:56.788 [2024-07-12 21:30:35.489024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.788 [2024-07-12 21:30:35.489151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:56.788 [2024-07-12 21:30:35.489176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:56.788 #32 NEW cov: 11822 ft: 15074 corp: 22/812b lim: 90 exec/s: 32 rss: 69Mb L: 77/78 MS: 1 InsertRepeatedBytes- 00:07:56.788 [2024-07-12 21:30:35.528035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.788 [2024-07-12 21:30:35.528061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.788 #33 NEW cov: 11822 ft: 15141 corp: 23/839b lim: 90 exec/s: 33 rss: 69Mb L: 27/78 MS: 1 EraseBytes- 00:07:56.788 [2024-07-12 21:30:35.568929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:56.788 [2024-07-12 21:30:35.568962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:56.788 [2024-07-12 21:30:35.569063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:56.788 [2024-07-12 21:30:35.569089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:56.788 [2024-07-12 21:30:35.569215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:56.788 [2024-07-12 21:30:35.569242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:56.788 [2024-07-12 21:30:35.569381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:56.788 [2024-07-12 21:30:35.569404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.047 #34 NEW cov: 11822 ft: 15152 corp: 24/918b lim: 90 exec/s: 34 rss: 69Mb L: 79/79 MS: 1 CrossOver- 00:07:57.047 [2024-07-12 21:30:35.608745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.047 [2024-07-12 21:30:35.608779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.047 [2024-07-12 21:30:35.608905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.047 [2024-07-12 21:30:35.608923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.047 [2024-07-12 21:30:35.609042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:57.047 [2024-07-12 21:30:35.609065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.047 #35 NEW cov: 11822 ft: 15184 corp: 25/988b 
lim: 90 exec/s: 35 rss: 69Mb L: 70/79 MS: 1 InsertRepeatedBytes- 00:07:57.047 [2024-07-12 21:30:35.658990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.047 [2024-07-12 21:30:35.659022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.047 [2024-07-12 21:30:35.659145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.047 [2024-07-12 21:30:35.659169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.047 [2024-07-12 21:30:35.659297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:57.047 [2024-07-12 21:30:35.659319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.047 #36 NEW cov: 11822 ft: 15197 corp: 26/1058b lim: 90 exec/s: 36 rss: 70Mb L: 70/79 MS: 1 ShuffleBytes- 00:07:57.047 [2024-07-12 21:30:35.708309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.047 [2024-07-12 21:30:35.708335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.047 #37 NEW cov: 11822 ft: 15218 corp: 27/1087b lim: 90 exec/s: 37 rss: 70Mb L: 29/79 MS: 1 CopyPart- 00:07:57.047 [2024-07-12 21:30:35.749353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.047 [2024-07-12 21:30:35.749385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.047 [2024-07-12 21:30:35.749514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.047 [2024-07-12 21:30:35.749543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.047 [2024-07-12 21:30:35.749677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:57.047 [2024-07-12 21:30:35.749698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.047 [2024-07-12 21:30:35.749811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:57.047 [2024-07-12 21:30:35.749835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.047 #38 NEW cov: 11822 ft: 15224 corp: 28/1171b lim: 90 exec/s: 38 rss: 70Mb L: 84/84 MS: 1 InsertRepeatedBytes- 00:07:57.047 [2024-07-12 21:30:35.789593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.047 [2024-07-12 21:30:35.789626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.047 [2024-07-12 21:30:35.789712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.047 [2024-07-12 21:30:35.789733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.047 [2024-07-12 21:30:35.789849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:57.047 [2024-07-12 21:30:35.789869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.047 [2024-07-12 21:30:35.789978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:57.047 [2024-07-12 21:30:35.790001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.047 #44 NEW cov: 11822 ft: 15237 corp: 29/1255b lim: 90 exec/s: 44 rss: 70Mb L: 84/84 MS: 1 CrossOver- 00:07:57.307 [2024-07-12 21:30:35.839735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.307 [2024-07-12 21:30:35.839770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.307 [2024-07-12 21:30:35.839875] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.307 [2024-07-12 21:30:35.839896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.307 [2024-07-12 21:30:35.840019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:57.307 [2024-07-12 21:30:35.840041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.307 [2024-07-12 21:30:35.840134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:57.307 [2024-07-12 21:30:35.840154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.307 #45 NEW cov: 11822 ft: 15272 corp: 30/1339b lim: 90 exec/s: 45 rss: 70Mb L: 84/84 MS: 1 ChangeBinInt- 00:07:57.307 [2024-07-12 21:30:35.889859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.307 [2024-07-12 21:30:35.889889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.307 [2024-07-12 21:30:35.889990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.307 [2024-07-12 21:30:35.890008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.307 [2024-07-12 21:30:35.890124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:57.307 [2024-07-12 21:30:35.890144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.307 [2024-07-12 21:30:35.890257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:57.307 [2024-07-12 21:30:35.890276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.307 #46 NEW cov: 11822 ft: 15309 
corp: 31/1423b lim: 90 exec/s: 46 rss: 70Mb L: 84/84 MS: 1 ChangeBinInt- 00:07:57.307 [2024-07-12 21:30:35.929843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.307 [2024-07-12 21:30:35.929876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.307 [2024-07-12 21:30:35.929996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.307 [2024-07-12 21:30:35.930018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.307 [2024-07-12 21:30:35.930137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:57.307 [2024-07-12 21:30:35.930157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.307 [2024-07-12 21:30:35.930277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:57.307 [2024-07-12 21:30:35.930296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:57.307 #47 NEW cov: 11822 ft: 15318 corp: 32/1501b lim: 90 exec/s: 47 rss: 70Mb L: 78/84 MS: 1 CopyPart- 00:07:57.307 [2024-07-12 21:30:35.979320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.307 [2024-07-12 21:30:35.979352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.307 #48 NEW cov: 11822 ft: 15320 corp: 33/1535b lim: 90 exec/s: 48 rss: 70Mb L: 34/84 MS: 1 CrossOver- 00:07:57.307 [2024-07-12 21:30:36.029431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.307 [2024-07-12 21:30:36.029469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.307 #49 NEW cov: 11822 ft: 15334 corp: 34/1565b lim: 90 exec/s: 49 rss: 70Mb L: 30/84 MS: 1 CopyPart- 00:07:57.307 [2024-07-12 21:30:36.079807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.307 [2024-07-12 21:30:36.079837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.307 [2024-07-12 21:30:36.079973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.307 [2024-07-12 21:30:36.079991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.566 #50 NEW cov: 11822 ft: 15348 corp: 35/1605b lim: 90 exec/s: 50 rss: 70Mb L: 40/84 MS: 1 ShuffleBytes- 00:07:57.566 [2024-07-12 21:30:36.120268] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.566 [2024-07-12 21:30:36.120301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.566 [2024-07-12 21:30:36.120448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 
00:07:57.566 [2024-07-12 21:30:36.120469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.567 [2024-07-12 21:30:36.120603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:57.567 [2024-07-12 21:30:36.120621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.567 #51 NEW cov: 11822 ft: 15354 corp: 36/1669b lim: 90 exec/s: 51 rss: 70Mb L: 64/84 MS: 1 EraseBytes- 00:07:57.567 [2024-07-12 21:30:36.160403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.567 [2024-07-12 21:30:36.160434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.567 [2024-07-12 21:30:36.160577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.567 [2024-07-12 21:30:36.160612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.567 [2024-07-12 21:30:36.160740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:57.567 [2024-07-12 21:30:36.160757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:57.567 #52 NEW cov: 11822 ft: 15404 corp: 37/1733b lim: 90 exec/s: 52 rss: 70Mb L: 64/84 MS: 1 ChangeBit- 00:07:57.567 [2024-07-12 21:30:36.199958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.567 [2024-07-12 21:30:36.199988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.567 #53 NEW cov: 11822 ft: 15426 corp: 38/1764b lim: 90 exec/s: 53 rss: 70Mb L: 31/84 MS: 1 ChangeByte- 00:07:57.567 [2024-07-12 21:30:36.240412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.567 [2024-07-12 21:30:36.240446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.567 [2024-07-12 21:30:36.240570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.567 [2024-07-12 21:30:36.240591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.567 #54 NEW cov: 11822 ft: 15456 corp: 39/1813b lim: 90 exec/s: 54 rss: 70Mb L: 49/84 MS: 1 InsertRepeatedBytes- 00:07:57.567 [2024-07-12 21:30:36.280194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.567 [2024-07-12 21:30:36.280217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.567 #55 NEW cov: 11822 ft: 15465 corp: 40/1837b lim: 90 exec/s: 55 rss: 70Mb L: 24/84 MS: 1 InsertByte- 00:07:57.567 [2024-07-12 21:30:36.320531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.567 [2024-07-12 21:30:36.320562] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.567 [2024-07-12 21:30:36.320703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.567 [2024-07-12 21:30:36.320719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.567 #56 NEW cov: 11822 ft: 15478 corp: 41/1877b lim: 90 exec/s: 56 rss: 70Mb L: 40/84 MS: 1 ChangeBinInt- 00:07:57.826 [2024-07-12 21:30:36.360683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:57.826 [2024-07-12 21:30:36.360715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:57.826 [2024-07-12 21:30:36.360846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:57.826 [2024-07-12 21:30:36.360869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:57.826 #58 NEW cov: 11822 ft: 15523 corp: 42/1929b lim: 90 exec/s: 29 rss: 70Mb L: 52/84 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:57.826 #58 DONE cov: 11822 ft: 15523 corp: 42/1929b lim: 90 exec/s: 29 rss: 70Mb 00:07:57.826 ###### Recommended dictionary. ###### 00:07:57.826 "h\000\000\000\000\000\000\000" # Uses: 2 00:07:57.826 "\000\000\000\222" # Uses: 0 00:07:57.826 "\001(\375bc\004\023J" # Uses: 0 00:07:57.826 ###### End of recommended dictionary. ###### 00:07:57.826 Done 58 runs in 2 second(s) 00:07:57.826 21:30:36 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:07:57.826 21:30:36 -- ../common.sh@72 -- # (( i++ )) 00:07:57.826 21:30:36 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:57.826 21:30:36 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:07:57.826 21:30:36 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:07:57.826 21:30:36 -- nvmf/run.sh@24 -- # local timen=1 00:07:57.826 21:30:36 -- nvmf/run.sh@25 -- # local core=0x1 00:07:57.826 21:30:36 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:57.826 21:30:36 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:07:57.826 21:30:36 -- nvmf/run.sh@29 -- # printf %02d 21 00:07:57.826 21:30:36 -- nvmf/run.sh@29 -- # port=4421 00:07:57.826 21:30:36 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:57.826 21:30:36 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:07:57.826 21:30:36 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:57.827 21:30:36 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:07:57.827 [2024-07-12 21:30:36.552703] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:57.827 [2024-07-12 21:30:36.552795] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3587118 ] 00:07:57.827 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.086 [2024-07-12 21:30:36.730675] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.086 [2024-07-12 21:30:36.793254] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:58.086 [2024-07-12 21:30:36.793373] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.086 [2024-07-12 21:30:36.851173] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.086 [2024-07-12 21:30:36.867474] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:07:58.345 INFO: Running with entropic power schedule (0xFF, 100). 00:07:58.345 INFO: Seed: 3667639407 00:07:58.345 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9), 00:07:58.345 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480), 00:07:58.345 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:58.345 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.345 #2 INITED exec/s: 0 rss: 60Mb 00:07:58.345 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:58.345 This may also happen if the target rejected all inputs we tried so far 00:07:58.345 [2024-07-12 21:30:36.912620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.345 [2024-07-12 21:30:36.912649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.604 NEW_FUNC[1/672]: 0x4a6030 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:07:58.604 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.604 #16 NEW cov: 11570 ft: 11571 corp: 2/18b lim: 50 exec/s: 0 rss: 66Mb L: 17/17 MS: 4 ChangeBit-CopyPart-CopyPart-InsertRepeatedBytes- 00:07:58.604 [2024-07-12 21:30:37.223674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.604 [2024-07-12 21:30:37.223709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.604 [2024-07-12 21:30:37.223767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:58.604 [2024-07-12 21:30:37.223784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.604 [2024-07-12 21:30:37.223842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:58.604 [2024-07-12 21:30:37.223861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.604 #17 NEW cov: 11683 ft: 12804 corp: 3/54b lim: 50 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:58.604 [2024-07-12 
21:30:37.273725] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.604 [2024-07-12 21:30:37.273756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.604 [2024-07-12 21:30:37.273798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:58.604 [2024-07-12 21:30:37.273814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.604 [2024-07-12 21:30:37.273870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:58.604 [2024-07-12 21:30:37.273887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.604 #28 NEW cov: 11689 ft: 13150 corp: 4/90b lim: 50 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 ChangeBinInt- 00:07:58.604 [2024-07-12 21:30:37.313568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.604 [2024-07-12 21:30:37.313596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.604 #29 NEW cov: 11774 ft: 13429 corp: 5/104b lim: 50 exec/s: 0 rss: 67Mb L: 14/36 MS: 1 CrossOver- 00:07:58.604 [2024-07-12 21:30:37.354127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.604 [2024-07-12 21:30:37.354156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.604 [2024-07-12 21:30:37.354198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:58.604 [2024-07-12 21:30:37.354214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.604 [2024-07-12 21:30:37.354268] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:58.604 [2024-07-12 21:30:37.354283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.604 [2024-07-12 21:30:37.354339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:58.604 [2024-07-12 21:30:37.354353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.604 #30 NEW cov: 11774 ft: 13847 corp: 6/147b lim: 50 exec/s: 0 rss: 67Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:07:58.864 [2024-07-12 21:30:37.394073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.864 [2024-07-12 21:30:37.394100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.394136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:58.864 [2024-07-12 21:30:37.394153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 
21:30:37.394209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:58.864 [2024-07-12 21:30:37.394225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.864 #31 NEW cov: 11774 ft: 13961 corp: 7/184b lim: 50 exec/s: 0 rss: 67Mb L: 37/43 MS: 1 InsertByte- 00:07:58.864 [2024-07-12 21:30:37.434334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.864 [2024-07-12 21:30:37.434367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.434405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:58.864 [2024-07-12 21:30:37.434421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.434476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:58.864 [2024-07-12 21:30:37.434492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.434547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:58.864 [2024-07-12 21:30:37.434562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.864 #33 NEW cov: 11774 ft: 14028 corp: 8/230b lim: 50 exec/s: 0 rss: 67Mb L: 46/46 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:58.864 [2024-07-12 21:30:37.474288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.864 [2024-07-12 21:30:37.474316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.474353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:58.864 [2024-07-12 21:30:37.474370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.474430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:58.864 [2024-07-12 21:30:37.474449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.864 #34 NEW cov: 11774 ft: 14136 corp: 9/266b lim: 50 exec/s: 0 rss: 67Mb L: 36/46 MS: 1 ChangeBinInt- 00:07:58.864 [2024-07-12 21:30:37.514450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.864 [2024-07-12 21:30:37.514478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.514516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:58.864 [2024-07-12 21:30:37.514532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.864 
[2024-07-12 21:30:37.514588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:58.864 [2024-07-12 21:30:37.514603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.864 #35 NEW cov: 11774 ft: 14147 corp: 10/302b lim: 50 exec/s: 0 rss: 67Mb L: 36/46 MS: 1 ChangeByte- 00:07:58.864 [2024-07-12 21:30:37.554700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.864 [2024-07-12 21:30:37.554728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.554766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:58.864 [2024-07-12 21:30:37.554781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.554837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:58.864 [2024-07-12 21:30:37.554854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.554910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:58.864 [2024-07-12 21:30:37.554928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.864 #36 NEW cov: 11774 ft: 14205 corp: 11/348b lim: 50 exec/s: 0 rss: 68Mb L: 46/46 MS: 1 CopyPart- 00:07:58.864 [2024-07-12 21:30:37.594872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.864 [2024-07-12 21:30:37.594899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.594938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:58.864 [2024-07-12 21:30:37.594954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.595011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:58.864 [2024-07-12 21:30:37.595028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.595085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:58.864 [2024-07-12 21:30:37.595100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:58.864 #37 NEW cov: 11774 ft: 14235 corp: 12/394b lim: 50 exec/s: 0 rss: 68Mb L: 46/46 MS: 1 ChangeByte- 00:07:58.864 [2024-07-12 21:30:37.634801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:58.864 [2024-07-12 21:30:37.634830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 
21:30:37.634870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:58.864 [2024-07-12 21:30:37.634886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:58.864 [2024-07-12 21:30:37.634944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:58.864 [2024-07-12 21:30:37.634961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.180 #38 NEW cov: 11774 ft: 14265 corp: 13/430b lim: 50 exec/s: 0 rss: 69Mb L: 36/46 MS: 1 ChangeByte- 00:07:59.180 [2024-07-12 21:30:37.674574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.180 [2024-07-12 21:30:37.674601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.180 #40 NEW cov: 11774 ft: 14299 corp: 14/440b lim: 50 exec/s: 0 rss: 69Mb L: 10/46 MS: 2 EraseBytes-InsertByte- 00:07:59.180 [2024-07-12 21:30:37.714690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.180 [2024-07-12 21:30:37.714718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.180 #41 NEW cov: 11774 ft: 14403 corp: 15/454b lim: 50 exec/s: 0 rss: 69Mb L: 14/46 MS: 1 ShuffleBytes- 00:07:59.180 [2024-07-12 21:30:37.755273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.180 [2024-07-12 21:30:37.755300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.180 [2024-07-12 21:30:37.755342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.180 [2024-07-12 21:30:37.755357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.180 [2024-07-12 21:30:37.755413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.180 [2024-07-12 21:30:37.755433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.180 [2024-07-12 21:30:37.755495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:59.180 [2024-07-12 21:30:37.755510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.180 #42 NEW cov: 11774 ft: 14418 corp: 16/502b lim: 50 exec/s: 0 rss: 69Mb L: 48/48 MS: 1 CopyPart- 00:07:59.180 [2024-07-12 21:30:37.795226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.180 [2024-07-12 21:30:37.795253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.180 [2024-07-12 21:30:37.795291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.180 [2024-07-12 21:30:37.795307] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.180 [2024-07-12 21:30:37.795366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.180 [2024-07-12 21:30:37.795382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.180 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:59.180 #43 NEW cov: 11797 ft: 14471 corp: 17/538b lim: 50 exec/s: 0 rss: 69Mb L: 36/48 MS: 1 ChangeBit- 00:07:59.180 [2024-07-12 21:30:37.845337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.180 [2024-07-12 21:30:37.845364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.180 [2024-07-12 21:30:37.845401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.180 [2024-07-12 21:30:37.845418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.180 [2024-07-12 21:30:37.845474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.180 [2024-07-12 21:30:37.845491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.180 #44 NEW cov: 11797 ft: 14551 corp: 18/577b lim: 50 exec/s: 0 rss: 69Mb L: 39/48 MS: 1 CopyPart- 00:07:59.180 [2024-07-12 21:30:37.885226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.180 [2024-07-12 21:30:37.885254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.180 #45 NEW cov: 11797 ft: 14601 corp: 19/591b lim: 50 exec/s: 45 rss: 69Mb L: 14/48 MS: 1 ShuffleBytes- 00:07:59.180 [2024-07-12 21:30:37.925674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.180 [2024-07-12 21:30:37.925703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.180 [2024-07-12 21:30:37.925747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.180 [2024-07-12 21:30:37.925762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.180 [2024-07-12 21:30:37.925818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.180 [2024-07-12 21:30:37.925835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.478 #46 NEW cov: 11797 ft: 14671 corp: 20/627b lim: 50 exec/s: 46 rss: 69Mb L: 36/48 MS: 1 ChangeBinInt- 00:07:59.478 [2024-07-12 21:30:37.965745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.478 [2024-07-12 21:30:37.965772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:07:59.478 [2024-07-12 21:30:37.965812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.478 [2024-07-12 21:30:37.965828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:37.965886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.478 [2024-07-12 21:30:37.965902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.478 #47 NEW cov: 11797 ft: 14690 corp: 21/664b lim: 50 exec/s: 47 rss: 69Mb L: 37/48 MS: 1 CopyPart- 00:07:59.478 [2024-07-12 21:30:38.005834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.478 [2024-07-12 21:30:38.005861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.005899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.478 [2024-07-12 21:30:38.005913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.005966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.478 [2024-07-12 21:30:38.005982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.478 #48 NEW cov: 11797 ft: 14712 corp: 22/700b lim: 50 exec/s: 48 rss: 69Mb L: 36/48 MS: 1 ChangeBinInt- 00:07:59.478 [2024-07-12 21:30:38.045690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.478 [2024-07-12 21:30:38.045718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.478 #49 NEW cov: 11797 ft: 14803 corp: 23/714b lim: 50 exec/s: 49 rss: 69Mb L: 14/48 MS: 1 ChangeBit- 00:07:59.478 [2024-07-12 21:30:38.076050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.478 [2024-07-12 21:30:38.076078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.076117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.478 [2024-07-12 21:30:38.076132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.076189] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.478 [2024-07-12 21:30:38.076205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.478 #50 NEW cov: 11797 ft: 14831 corp: 24/750b lim: 50 exec/s: 50 rss: 69Mb L: 36/48 MS: 1 ChangeBit- 00:07:59.478 [2024-07-12 21:30:38.116035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.478 [2024-07-12 21:30:38.116062] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.116096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.478 [2024-07-12 21:30:38.116112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.478 #51 NEW cov: 11797 ft: 15097 corp: 25/777b lim: 50 exec/s: 51 rss: 70Mb L: 27/48 MS: 1 EraseBytes- 00:07:59.478 [2024-07-12 21:30:38.156659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.478 [2024-07-12 21:30:38.156690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.156732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.478 [2024-07-12 21:30:38.156747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.156802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.478 [2024-07-12 21:30:38.156817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.156873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:59.478 [2024-07-12 21:30:38.156889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.156944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:59.478 [2024-07-12 21:30:38.156960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:59.478 #52 NEW cov: 11797 ft: 15158 corp: 26/827b lim: 50 exec/s: 52 rss: 70Mb L: 50/50 MS: 1 CopyPart- 00:07:59.478 [2024-07-12 21:30:38.206450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.478 [2024-07-12 21:30:38.206477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.206514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.478 [2024-07-12 21:30:38.206530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.206587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.478 [2024-07-12 21:30:38.206602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.478 #53 NEW cov: 11797 ft: 15162 corp: 27/863b lim: 50 exec/s: 53 rss: 70Mb L: 36/50 MS: 1 ShuffleBytes- 00:07:59.478 [2024-07-12 21:30:38.246588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.478 [2024-07-12 21:30:38.246619] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.246658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.478 [2024-07-12 21:30:38.246675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.478 [2024-07-12 21:30:38.246733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.478 [2024-07-12 21:30:38.246748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.738 #54 NEW cov: 11797 ft: 15173 corp: 28/899b lim: 50 exec/s: 54 rss: 70Mb L: 36/50 MS: 1 ShuffleBytes- 00:07:59.738 [2024-07-12 21:30:38.286841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.738 [2024-07-12 21:30:38.286868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.286907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.738 [2024-07-12 21:30:38.286922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.286982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.738 [2024-07-12 21:30:38.286998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.287056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:59.738 [2024-07-12 21:30:38.287071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.738 #55 NEW cov: 11797 ft: 15178 corp: 29/945b lim: 50 exec/s: 55 rss: 70Mb L: 46/50 MS: 1 InsertRepeatedBytes- 00:07:59.738 [2024-07-12 21:30:38.326985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.738 [2024-07-12 21:30:38.327013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.327050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.738 [2024-07-12 21:30:38.327066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.327121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.738 [2024-07-12 21:30:38.327135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.327193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:59.738 [2024-07-12 21:30:38.327208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 
p:0 m:0 dnr:1 00:07:59.738 #58 NEW cov: 11797 ft: 15245 corp: 30/994b lim: 50 exec/s: 58 rss: 70Mb L: 49/50 MS: 3 CopyPart-CopyPart-InsertRepeatedBytes- 00:07:59.738 [2024-07-12 21:30:38.367048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.738 [2024-07-12 21:30:38.367075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.367119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.738 [2024-07-12 21:30:38.367134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.367190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.738 [2024-07-12 21:30:38.367205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.367262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:59.738 [2024-07-12 21:30:38.367277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.738 #59 NEW cov: 11797 ft: 15249 corp: 31/1040b lim: 50 exec/s: 59 rss: 70Mb L: 46/50 MS: 1 ChangeByte- 00:07:59.738 [2024-07-12 21:30:38.407155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.738 [2024-07-12 21:30:38.407182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.407227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.738 [2024-07-12 21:30:38.407243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.407300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.738 [2024-07-12 21:30:38.407316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.407376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:59.738 [2024-07-12 21:30:38.407391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.738 #60 NEW cov: 11797 ft: 15257 corp: 32/1080b lim: 50 exec/s: 60 rss: 70Mb L: 40/50 MS: 1 CMP- DE: "\000\000\002\000"- 00:07:59.738 [2024-07-12 21:30:38.447117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.738 [2024-07-12 21:30:38.447143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.447182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.738 [2024-07-12 21:30:38.447198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.738 [2024-07-12 21:30:38.447256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.738 [2024-07-12 21:30:38.447273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.738 #61 NEW cov: 11797 ft: 15258 corp: 33/1117b lim: 50 exec/s: 61 rss: 70Mb L: 37/50 MS: 1 InsertByte- 00:07:59.738 [2024-07-12 21:30:38.487418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.739 [2024-07-12 21:30:38.487447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.739 [2024-07-12 21:30:38.487487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.739 [2024-07-12 21:30:38.487503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.739 [2024-07-12 21:30:38.487561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.739 [2024-07-12 21:30:38.487577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.739 [2024-07-12 21:30:38.487633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:59.739 [2024-07-12 21:30:38.487648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.739 #62 NEW cov: 11797 ft: 15272 corp: 34/1165b lim: 50 exec/s: 62 rss: 70Mb L: 48/50 MS: 1 ChangeBit- 00:07:59.998 [2024-07-12 21:30:38.527096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.998 [2024-07-12 21:30:38.527123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.998 #63 NEW cov: 11797 ft: 15295 corp: 35/1179b lim: 50 exec/s: 63 rss: 70Mb L: 14/50 MS: 1 ChangeByte- 00:07:59.998 [2024-07-12 21:30:38.567703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.998 [2024-07-12 21:30:38.567730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.998 [2024-07-12 21:30:38.567768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.998 [2024-07-12 21:30:38.567786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.998 [2024-07-12 21:30:38.567841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.998 [2024-07-12 21:30:38.567857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.998 [2024-07-12 21:30:38.567914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:59.998 [2024-07-12 21:30:38.567932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:59.998 #64 NEW cov: 11797 ft: 15305 corp: 36/1228b lim: 50 exec/s: 64 rss: 70Mb L: 49/50 MS: 1 CopyPart- 00:07:59.998 [2024-07-12 21:30:38.607468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.998 [2024-07-12 21:30:38.607496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.998 [2024-07-12 21:30:38.607539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.998 [2024-07-12 21:30:38.607554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.998 #65 NEW cov: 11797 ft: 15317 corp: 37/1255b lim: 50 exec/s: 65 rss: 70Mb L: 27/50 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:07:59.998 [2024-07-12 21:30:38.647721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.998 [2024-07-12 21:30:38.647749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.998 [2024-07-12 21:30:38.647784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.998 [2024-07-12 21:30:38.647800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.998 [2024-07-12 21:30:38.647857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.998 [2024-07-12 21:30:38.647874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.998 #66 NEW cov: 11797 ft: 15324 corp: 38/1291b lim: 50 exec/s: 66 rss: 70Mb L: 36/50 MS: 1 ChangeBinInt- 00:07:59.998 [2024-07-12 21:30:38.687571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.998 [2024-07-12 21:30:38.687600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.998 #67 NEW cov: 11797 ft: 15334 corp: 39/1305b lim: 50 exec/s: 67 rss: 70Mb L: 14/50 MS: 1 ChangeByte- 00:07:59.998 [2024-07-12 21:30:38.727951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.998 [2024-07-12 21:30:38.727981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.998 [2024-07-12 21:30:38.728037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.998 [2024-07-12 21:30:38.728054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.998 [2024-07-12 21:30:38.728112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.998 [2024-07-12 21:30:38.728129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:59.998 #68 NEW cov: 11797 ft: 15344 corp: 40/1342b lim: 50 exec/s: 68 rss: 70Mb L: 37/50 MS: 1 ChangeByte- 00:07:59.998 
[2024-07-12 21:30:38.768056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:59.998 [2024-07-12 21:30:38.768083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:59.998 [2024-07-12 21:30:38.768130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:59.998 [2024-07-12 21:30:38.768146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:59.998 [2024-07-12 21:30:38.768202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:59.998 [2024-07-12 21:30:38.768221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.258 #69 NEW cov: 11797 ft: 15349 corp: 41/1379b lim: 50 exec/s: 69 rss: 70Mb L: 37/50 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:08:00.258 [2024-07-12 21:30:38.808216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:00.258 [2024-07-12 21:30:38.808243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.258 [2024-07-12 21:30:38.808282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:00.258 [2024-07-12 21:30:38.808299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.258 [2024-07-12 21:30:38.808342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:00.258 [2024-07-12 21:30:38.808358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.258 #70 NEW cov: 11797 ft: 15366 corp: 42/1417b lim: 50 exec/s: 70 rss: 70Mb L: 38/50 MS: 1 InsertByte- 00:08:00.258 [2024-07-12 21:30:38.848462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:00.258 [2024-07-12 21:30:38.848490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.258 [2024-07-12 21:30:38.848528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:00.258 [2024-07-12 21:30:38.848543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.258 [2024-07-12 21:30:38.848597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:00.258 [2024-07-12 21:30:38.848613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.258 [2024-07-12 21:30:38.848669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:00.258 [2024-07-12 21:30:38.848701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.258 #71 NEW cov: 11797 ft: 15378 corp: 43/1466b lim: 50 exec/s: 71 rss: 70Mb L: 49/50 MS: 1 
ChangeByte- 00:08:00.258 [2024-07-12 21:30:38.888695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:00.258 [2024-07-12 21:30:38.888722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.258 [2024-07-12 21:30:38.888763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:00.258 [2024-07-12 21:30:38.888776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:00.258 [2024-07-12 21:30:38.888832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:00.258 [2024-07-12 21:30:38.888848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:00.258 [2024-07-12 21:30:38.888907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:00.258 [2024-07-12 21:30:38.888923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:00.258 [2024-07-12 21:30:38.888981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:00.258 [2024-07-12 21:30:38.888997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:00.258 #72 NEW cov: 11797 ft: 15386 corp: 44/1516b lim: 50 exec/s: 36 rss: 70Mb L: 50/50 MS: 1 CMP- DE: "\377\377\377\001"- 00:08:00.258 #72 DONE cov: 11797 ft: 15386 corp: 44/1516b lim: 50 exec/s: 36 rss: 70Mb 00:08:00.258 ###### Recommended dictionary. ###### 00:08:00.258 "\000\000\002\000" # Uses: 2 00:08:00.258 "\377\377\377\001" # Uses: 0 00:08:00.258 ###### End of recommended dictionary. 
######
00:08:00.258 Done 72 runs in 2 second(s)
00:08:00.258 21:30:39 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf
00:08:00.258 21:30:39 -- ../common.sh@72 -- # (( i++ ))
00:08:00.258 21:30:39 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:00.258 21:30:39 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1
00:08:00.258 21:30:39 -- nvmf/run.sh@23 -- # local fuzzer_type=22
00:08:00.258 21:30:39 -- nvmf/run.sh@24 -- # local timen=1
00:08:00.258 21:30:39 -- nvmf/run.sh@25 -- # local core=0x1
00:08:00.518 21:30:39 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:00.518 21:30:39 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf
00:08:00.518 21:30:39 -- nvmf/run.sh@29 -- # printf %02d 22
00:08:00.518 21:30:39 -- nvmf/run.sh@29 -- # port=4422
00:08:00.518 21:30:39 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:00.518 21:30:39 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422'
00:08:00.518 21:30:39 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:00.518 21:30:39 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock
00:08:00.518 [2024-07-12 21:30:39.081361] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:08:00.518 [2024-07-12 21:30:39.081430] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3587628 ]
00:08:00.518 EAL: No free 2048 kB hugepages reported on node 1
00:08:00.518 [2024-07-12 21:30:39.259777] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:00.776 [2024-07-12 21:30:39.322727] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:00.776 [2024-07-12 21:30:39.322869] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:00.776 [2024-07-12 21:30:39.380887] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:00.776 [2024-07-12 21:30:39.397186] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 ***
00:08:00.776 INFO: Running with entropic power schedule (0xFF, 100).
00:08:00.776 INFO: Seed: 1902671588
00:08:00.776 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:08:00.777 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:08:00.777 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:00.777 INFO: A corpus is not provided, starting from an empty corpus
00:08:00.777 #2 INITED exec/s: 0 rss: 60Mb
00:08:00.777 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
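The shell trace above shows the fixed per-round pattern nvmf/run.sh follows: start_llvm_fuzz receives the fuzzer index, the run time in seconds, and a core mask; the listen port is derived from the index (printf %02d 22 gives 4422); the JSON target config has its trsvcid rewritten for that port; and llvm_nvme_fuzz is launched against the resulting target. Below is a condensed sketch of that sequence, keeping the trace's own variable names. SPDK_DIR is a stand-in for the Jenkins workspace path, the 44-prefix port derivation is inferred from the port=4422 line rather than shown directly in the trace, and the sed redirect is assumed since the trace does not show where the rewritten config is written.

  # Sketch of the per-round launch, reconstructed from the set -x trace above (round 22 shown).
  fuzzer_type=22; timen=1; core=0x1                  # the three arguments to start_llvm_fuzz
  port="44$(printf %02d "$fuzzer_type")"             # assumed derivation; the trace shows port=4422
  nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
  corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}"
  mkdir -p "$corpus_dir"
  # Point the template config at this round's TCP service id (output redirect assumed).
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  # Flags mirror the invocation logged at nvmf/run.sh@36: -t caps the round at $timen
  # seconds, -D selects the per-round corpus, -Z appears to select which command
  # handler to fuzz, and -F names the NVMe/TCP target the fuzzer connects to.
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
      -P "$SPDK_DIR/../output/llvm/" \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
      -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" \
      -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"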
00:08:00.777 This may also happen if the target rejected all inputs we tried so far 00:08:00.777 [2024-07-12 21:30:39.442427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:00.777 [2024-07-12 21:30:39.442461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:00.777 [2024-07-12 21:30:39.442510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:00.777 [2024-07-12 21:30:39.442527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.035 NEW_FUNC[1/672]: 0x4a82f0 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:01.035 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.035 #4 NEW cov: 11596 ft: 11597 corp: 2/41b lim: 85 exec/s: 0 rss: 66Mb L: 40/40 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:08:01.035 [2024-07-12 21:30:39.753173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.035 [2024-07-12 21:30:39.753205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.035 [2024-07-12 21:30:39.753260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.035 [2024-07-12 21:30:39.753277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.035 #5 NEW cov: 11709 ft: 12013 corp: 3/82b lim: 85 exec/s: 0 rss: 67Mb L: 41/41 MS: 1 CrossOver- 00:08:01.035 [2024-07-12 21:30:39.793256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.035 [2024-07-12 21:30:39.793284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.035 [2024-07-12 21:30:39.793338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.035 [2024-07-12 21:30:39.793353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.035 #6 NEW cov: 11715 ft: 12239 corp: 4/122b lim: 85 exec/s: 0 rss: 67Mb L: 40/41 MS: 1 ShuffleBytes- 00:08:01.295 [2024-07-12 21:30:39.833293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.295 [2024-07-12 21:30:39.833320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.295 [2024-07-12 21:30:39.833369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.295 [2024-07-12 21:30:39.833385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.295 #12 NEW cov: 11800 ft: 12474 corp: 5/162b lim: 85 exec/s: 0 rss: 67Mb L: 40/41 MS: 1 CMP- DE: "\377\015"- 00:08:01.295 [2024-07-12 21:30:39.873421] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.295 [2024-07-12 21:30:39.873450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.295 [2024-07-12 21:30:39.873489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.295 [2024-07-12 21:30:39.873507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.295 #13 NEW cov: 11800 ft: 12657 corp: 6/203b lim: 85 exec/s: 0 rss: 67Mb L: 41/41 MS: 1 InsertByte- 00:08:01.295 [2024-07-12 21:30:39.913615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.295 [2024-07-12 21:30:39.913642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.295 [2024-07-12 21:30:39.913691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.295 [2024-07-12 21:30:39.913707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.295 #14 NEW cov: 11800 ft: 12770 corp: 7/245b lim: 85 exec/s: 0 rss: 67Mb L: 42/42 MS: 1 PersAutoDict- DE: "\377\015"- 00:08:01.295 [2024-07-12 21:30:39.953716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.295 [2024-07-12 21:30:39.953742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.295 [2024-07-12 21:30:39.953801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.295 [2024-07-12 21:30:39.953817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.295 #15 NEW cov: 11800 ft: 12810 corp: 8/285b lim: 85 exec/s: 0 rss: 67Mb L: 40/42 MS: 1 ChangeBinInt- 00:08:01.295 [2024-07-12 21:30:39.983778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.295 [2024-07-12 21:30:39.983804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.295 [2024-07-12 21:30:39.983848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.295 [2024-07-12 21:30:39.983864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.295 #16 NEW cov: 11800 ft: 12866 corp: 9/326b lim: 85 exec/s: 0 rss: 67Mb L: 41/42 MS: 1 ShuffleBytes- 00:08:01.295 [2024-07-12 21:30:40.024233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.295 [2024-07-12 21:30:40.024259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.295 [2024-07-12 21:30:40.024317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.295 [2024-07-12 21:30:40.024332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.295 [2024-07-12 21:30:40.024387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:01.295 [2024-07-12 21:30:40.024402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.295 [2024-07-12 21:30:40.024459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:01.295 [2024-07-12 21:30:40.024476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.295 #17 NEW cov: 11800 ft: 13317 corp: 10/408b lim: 85 exec/s: 0 rss: 68Mb L: 82/82 MS: 1 InsertRepeatedBytes- 00:08:01.555 [2024-07-12 21:30:40.084290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.555 [2024-07-12 21:30:40.084324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.555 [2024-07-12 21:30:40.084381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.555 [2024-07-12 21:30:40.084397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.555 [2024-07-12 21:30:40.084459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:01.555 [2024-07-12 21:30:40.084490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.555 #18 NEW cov: 11800 ft: 13660 corp: 11/466b lim: 85 exec/s: 0 rss: 68Mb L: 58/82 MS: 1 InsertRepeatedBytes- 00:08:01.555 [2024-07-12 21:30:40.124511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.555 [2024-07-12 21:30:40.124542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.555 [2024-07-12 21:30:40.124576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.555 [2024-07-12 21:30:40.124592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.555 [2024-07-12 21:30:40.124645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:01.555 [2024-07-12 21:30:40.124664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.555 [2024-07-12 21:30:40.124717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:01.555 [2024-07-12 21:30:40.124733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:01.555 #19 NEW cov: 11800 ft: 13771 corp: 12/536b lim: 85 exec/s: 0 rss: 68Mb L: 70/82 MS: 1 InsertRepeatedBytes- 00:08:01.555 [2024-07-12 21:30:40.164292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.555 [2024-07-12 21:30:40.164319] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.555 [2024-07-12 21:30:40.164376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.555 [2024-07-12 21:30:40.164394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.555 #20 NEW cov: 11800 ft: 13794 corp: 13/577b lim: 85 exec/s: 0 rss: 68Mb L: 41/82 MS: 1 ChangeBinInt- 00:08:01.555 [2024-07-12 21:30:40.204417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.555 [2024-07-12 21:30:40.204448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.555 [2024-07-12 21:30:40.204503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.555 [2024-07-12 21:30:40.204520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.555 #21 NEW cov: 11800 ft: 13801 corp: 14/618b lim: 85 exec/s: 0 rss: 68Mb L: 41/82 MS: 1 ChangeBinInt- 00:08:01.555 [2024-07-12 21:30:40.244567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.555 [2024-07-12 21:30:40.244594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.555 [2024-07-12 21:30:40.244636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.555 [2024-07-12 21:30:40.244650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.555 #22 NEW cov: 11800 ft: 13860 corp: 15/660b lim: 85 exec/s: 0 rss: 68Mb L: 42/82 MS: 1 InsertByte- 00:08:01.555 [2024-07-12 21:30:40.284584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.555 [2024-07-12 21:30:40.284612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.555 #23 NEW cov: 11800 ft: 14680 corp: 16/683b lim: 85 exec/s: 0 rss: 68Mb L: 23/82 MS: 1 EraseBytes- 00:08:01.555 [2024-07-12 21:30:40.324786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.555 [2024-07-12 21:30:40.324813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.555 [2024-07-12 21:30:40.324863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.555 [2024-07-12 21:30:40.324879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.815 NEW_FUNC[1/1]: 0x195e300 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.815 #24 NEW cov: 11823 ft: 14732 corp: 17/724b lim: 85 exec/s: 0 rss: 68Mb L: 41/82 MS: 1 ShuffleBytes- 00:08:01.815 [2024-07-12 21:30:40.364994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.815 
[2024-07-12 21:30:40.365020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.815 [2024-07-12 21:30:40.365064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.815 [2024-07-12 21:30:40.365079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.815 [2024-07-12 21:30:40.365118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:01.815 [2024-07-12 21:30:40.365135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.815 #25 NEW cov: 11823 ft: 14752 corp: 18/778b lim: 85 exec/s: 0 rss: 68Mb L: 54/82 MS: 1 InsertRepeatedBytes- 00:08:01.815 [2024-07-12 21:30:40.405021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.815 [2024-07-12 21:30:40.405048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.815 [2024-07-12 21:30:40.405092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.815 [2024-07-12 21:30:40.405109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.815 #26 NEW cov: 11823 ft: 14797 corp: 19/821b lim: 85 exec/s: 0 rss: 68Mb L: 43/82 MS: 1 PersAutoDict- DE: "\377\015"- 00:08:01.815 [2024-07-12 21:30:40.445100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.815 [2024-07-12 21:30:40.445127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.815 [2024-07-12 21:30:40.445174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.815 [2024-07-12 21:30:40.445192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.815 #27 NEW cov: 11823 ft: 14819 corp: 20/864b lim: 85 exec/s: 27 rss: 68Mb L: 43/82 MS: 1 ChangeByte- 00:08:01.815 [2024-07-12 21:30:40.485235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.815 [2024-07-12 21:30:40.485263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.815 [2024-07-12 21:30:40.485305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.815 [2024-07-12 21:30:40.485320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.815 #28 NEW cov: 11823 ft: 14830 corp: 21/907b lim: 85 exec/s: 28 rss: 68Mb L: 43/82 MS: 1 PersAutoDict- DE: "\377\015"- 00:08:01.815 [2024-07-12 21:30:40.515477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.815 [2024-07-12 21:30:40.515504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:01.815 [2024-07-12 21:30:40.515540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.815 [2024-07-12 21:30:40.515557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.815 [2024-07-12 21:30:40.515612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:01.815 [2024-07-12 21:30:40.515628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:01.815 #29 NEW cov: 11823 ft: 14836 corp: 22/961b lim: 85 exec/s: 29 rss: 68Mb L: 54/82 MS: 1 ChangeByte- 00:08:01.815 [2024-07-12 21:30:40.555458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.815 [2024-07-12 21:30:40.555485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.815 [2024-07-12 21:30:40.555542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.815 [2024-07-12 21:30:40.555558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:01.815 #30 NEW cov: 11823 ft: 14864 corp: 23/1004b lim: 85 exec/s: 30 rss: 68Mb L: 43/82 MS: 1 InsertByte- 00:08:01.815 [2024-07-12 21:30:40.595557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:01.815 [2024-07-12 21:30:40.595584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:01.815 [2024-07-12 21:30:40.595626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:01.815 [2024-07-12 21:30:40.595641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.075 #31 NEW cov: 11823 ft: 14868 corp: 24/1045b lim: 85 exec/s: 31 rss: 68Mb L: 41/82 MS: 1 PersAutoDict- DE: "\377\015"- 00:08:02.075 [2024-07-12 21:30:40.625916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.075 [2024-07-12 21:30:40.625943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.625991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.075 [2024-07-12 21:30:40.626006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.626060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:02.075 [2024-07-12 21:30:40.626076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.626132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:02.075 [2024-07-12 21:30:40.626148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:08:02.075 #32 NEW cov: 11823 ft: 14885 corp: 25/1127b lim: 85 exec/s: 32 rss: 68Mb L: 82/82 MS: 1 CopyPart- 00:08:02.075 [2024-07-12 21:30:40.666082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.075 [2024-07-12 21:30:40.666109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.666153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.075 [2024-07-12 21:30:40.666168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.666223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:02.075 [2024-07-12 21:30:40.666239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.666293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:02.075 [2024-07-12 21:30:40.666309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.075 #33 NEW cov: 11823 ft: 14900 corp: 26/1207b lim: 85 exec/s: 33 rss: 68Mb L: 80/82 MS: 1 InsertRepeatedBytes- 00:08:02.075 [2024-07-12 21:30:40.705942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.075 [2024-07-12 21:30:40.705970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.706031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.075 [2024-07-12 21:30:40.706048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.075 #34 NEW cov: 11823 ft: 14908 corp: 27/1247b lim: 85 exec/s: 34 rss: 68Mb L: 40/82 MS: 1 EraseBytes- 00:08:02.075 [2024-07-12 21:30:40.736193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.075 [2024-07-12 21:30:40.736235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.736273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.075 [2024-07-12 21:30:40.736290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.736342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:02.075 [2024-07-12 21:30:40.736358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.075 #35 NEW cov: 11823 ft: 14969 corp: 28/1301b lim: 85 exec/s: 35 rss: 69Mb L: 54/82 MS: 1 PersAutoDict- DE: "\377\015"- 00:08:02.075 [2024-07-12 21:30:40.776282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 
00:08:02.075 [2024-07-12 21:30:40.776308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.776349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.075 [2024-07-12 21:30:40.776365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.776419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:02.075 [2024-07-12 21:30:40.776435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.075 #36 NEW cov: 11823 ft: 15022 corp: 29/1359b lim: 85 exec/s: 36 rss: 69Mb L: 58/82 MS: 1 CopyPart- 00:08:02.075 [2024-07-12 21:30:40.816096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.075 [2024-07-12 21:30:40.816123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.075 #37 NEW cov: 11823 ft: 15031 corp: 30/1384b lim: 85 exec/s: 37 rss: 69Mb L: 25/82 MS: 1 EraseBytes- 00:08:02.075 [2024-07-12 21:30:40.856381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.075 [2024-07-12 21:30:40.856408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.075 [2024-07-12 21:30:40.856460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.075 [2024-07-12 21:30:40.856478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.335 #38 NEW cov: 11823 ft: 15054 corp: 31/1426b lim: 85 exec/s: 38 rss: 69Mb L: 42/82 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\001"- 00:08:02.335 [2024-07-12 21:30:40.896447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.335 [2024-07-12 21:30:40.896474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.335 [2024-07-12 21:30:40.896525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.335 [2024-07-12 21:30:40.896542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.335 #39 NEW cov: 11823 ft: 15063 corp: 32/1469b lim: 85 exec/s: 39 rss: 69Mb L: 43/82 MS: 1 ChangeByte- 00:08:02.335 [2024-07-12 21:30:40.936748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.335 [2024-07-12 21:30:40.936775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.335 [2024-07-12 21:30:40.936812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.335 [2024-07-12 21:30:40.936829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:02.335 [2024-07-12 21:30:40.936883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:02.335 [2024-07-12 21:30:40.936900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.335 #40 NEW cov: 11823 ft: 15096 corp: 33/1521b lim: 85 exec/s: 40 rss: 69Mb L: 52/82 MS: 1 CrossOver- 00:08:02.335 [2024-07-12 21:30:40.976964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.335 [2024-07-12 21:30:40.976992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.335 [2024-07-12 21:30:40.977030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.335 [2024-07-12 21:30:40.977046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.335 [2024-07-12 21:30:40.977101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:02.335 [2024-07-12 21:30:40.977117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.335 [2024-07-12 21:30:40.977172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:02.335 [2024-07-12 21:30:40.977189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.335 #41 NEW cov: 11823 ft: 15097 corp: 34/1591b lim: 85 exec/s: 41 rss: 69Mb L: 70/82 MS: 1 CopyPart- 00:08:02.335 [2024-07-12 21:30:41.016591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.335 [2024-07-12 21:30:41.016618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.335 #42 NEW cov: 11823 ft: 15126 corp: 35/1614b lim: 85 exec/s: 42 rss: 69Mb L: 23/82 MS: 1 ChangeByte- 00:08:02.335 [2024-07-12 21:30:41.056930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.335 [2024-07-12 21:30:41.056958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.335 [2024-07-12 21:30:41.057007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.335 [2024-07-12 21:30:41.057023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.335 #43 NEW cov: 11823 ft: 15130 corp: 36/1655b lim: 85 exec/s: 43 rss: 69Mb L: 41/82 MS: 1 ChangeByte- 00:08:02.335 [2024-07-12 21:30:41.086813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.335 [2024-07-12 21:30:41.086839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.335 #44 NEW cov: 11823 ft: 15167 corp: 37/1678b lim: 85 exec/s: 44 rss: 69Mb L: 23/82 MS: 1 ShuffleBytes- 00:08:02.594 [2024-07-12 21:30:41.127119] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.594 [2024-07-12 21:30:41.127145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.594 [2024-07-12 21:30:41.127203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.594 [2024-07-12 21:30:41.127219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.594 #45 NEW cov: 11823 ft: 15175 corp: 38/1720b lim: 85 exec/s: 45 rss: 69Mb L: 42/82 MS: 1 InsertByte- 00:08:02.594 [2024-07-12 21:30:41.167203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.595 [2024-07-12 21:30:41.167229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.595 [2024-07-12 21:30:41.167280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.595 [2024-07-12 21:30:41.167295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.595 [2024-07-12 21:30:41.197348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.595 [2024-07-12 21:30:41.197373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.595 [2024-07-12 21:30:41.197416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.595 [2024-07-12 21:30:41.197432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.595 #47 NEW cov: 11823 ft: 15222 corp: 39/1761b lim: 85 exec/s: 47 rss: 69Mb L: 41/82 MS: 2 ChangeBinInt-ChangeBinInt- 00:08:02.595 [2024-07-12 21:30:41.227409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.595 [2024-07-12 21:30:41.227435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.595 [2024-07-12 21:30:41.227478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.595 [2024-07-12 21:30:41.227493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.595 #48 NEW cov: 11823 ft: 15228 corp: 40/1801b lim: 85 exec/s: 48 rss: 69Mb L: 40/82 MS: 1 CopyPart- 00:08:02.595 [2024-07-12 21:30:41.257811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.595 [2024-07-12 21:30:41.257837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.595 [2024-07-12 21:30:41.257874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.595 [2024-07-12 21:30:41.257889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.595 [2024-07-12 
21:30:41.257941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:02.595 [2024-07-12 21:30:41.257957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.595 [2024-07-12 21:30:41.258010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:02.595 [2024-07-12 21:30:41.258026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.595 #49 NEW cov: 11823 ft: 15252 corp: 41/1883b lim: 85 exec/s: 49 rss: 70Mb L: 82/82 MS: 1 ChangeBinInt- 00:08:02.595 [2024-07-12 21:30:41.297614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.595 [2024-07-12 21:30:41.297641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.595 [2024-07-12 21:30:41.297691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.595 [2024-07-12 21:30:41.297707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.595 #50 NEW cov: 11823 ft: 15290 corp: 42/1921b lim: 85 exec/s: 50 rss: 70Mb L: 38/82 MS: 1 EraseBytes- 00:08:02.595 [2024-07-12 21:30:41.337717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.595 [2024-07-12 21:30:41.337744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.595 [2024-07-12 21:30:41.337781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.595 [2024-07-12 21:30:41.337796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.595 #51 NEW cov: 11823 ft: 15304 corp: 43/1960b lim: 85 exec/s: 51 rss: 70Mb L: 39/82 MS: 1 EraseBytes- 00:08:02.595 [2024-07-12 21:30:41.367781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.595 [2024-07-12 21:30:41.367808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.595 [2024-07-12 21:30:41.367851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.595 [2024-07-12 21:30:41.367867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.855 #52 NEW cov: 11823 ft: 15308 corp: 44/2001b lim: 85 exec/s: 52 rss: 70Mb L: 41/82 MS: 1 InsertByte- 00:08:02.855 [2024-07-12 21:30:41.398168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.855 [2024-07-12 21:30:41.398194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.855 [2024-07-12 21:30:41.398241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.855 [2024-07-12 21:30:41.398257] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.855 [2024-07-12 21:30:41.398308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:02.855 [2024-07-12 21:30:41.398324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:02.855 [2024-07-12 21:30:41.398379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:02.855 [2024-07-12 21:30:41.398395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:02.855 #53 NEW cov: 11823 ft: 15316 corp: 45/2069b lim: 85 exec/s: 53 rss: 70Mb L: 68/82 MS: 1 InsertRepeatedBytes- 00:08:02.855 [2024-07-12 21:30:41.438003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:02.855 [2024-07-12 21:30:41.438029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:02.855 [2024-07-12 21:30:41.438082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:02.855 [2024-07-12 21:30:41.438097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:02.855 #54 NEW cov: 11823 ft: 15318 corp: 46/2109b lim: 85 exec/s: 27 rss: 70Mb L: 40/82 MS: 1 ShuffleBytes- 00:08:02.855 #54 DONE cov: 11823 ft: 15318 corp: 46/2109b lim: 85 exec/s: 27 rss: 70Mb 00:08:02.855 ###### Recommended dictionary. ###### 00:08:02.855 "\377\015" # Uses: 5 00:08:02.855 "\001\000\000\000\000\000\000\001" # Uses: 0 00:08:02.855 ###### End of recommended dictionary. 
######
00:08:02.855 Done 54 runs in 2 second(s)
00:08:02.855 21:30:41 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf
00:08:02.855 21:30:41 -- ../common.sh@72 -- # (( i++ ))
00:08:02.855 21:30:41 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:02.855 21:30:41 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1
00:08:02.855 21:30:41 -- nvmf/run.sh@23 -- # local fuzzer_type=23
00:08:02.855 21:30:41 -- nvmf/run.sh@24 -- # local timen=1
00:08:02.855 21:30:41 -- nvmf/run.sh@25 -- # local core=0x1
00:08:02.855 21:30:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:02.855 21:30:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf
00:08:02.855 21:30:41 -- nvmf/run.sh@29 -- # printf %02d 23
00:08:02.855 21:30:41 -- nvmf/run.sh@29 -- # port=4423
00:08:02.855 21:30:41 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:02.855 21:30:41 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423'
00:08:02.855 21:30:41 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:02.855 21:30:41 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock
00:08:03.114 [2024-07-12 21:30:41.628493] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:08:03.114 [2024-07-12 21:30:41.628586] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3587958 ]
00:08:03.114 EAL: No free 2048 kB hugepages reported on node 1
00:08:03.114 [2024-07-12 21:30:41.807784] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:03.114 [2024-07-12 21:30:41.870838] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:03.114 [2024-07-12 21:30:41.870978] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:03.374 [2024-07-12 21:30:41.929129] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:03.374 [2024-07-12 21:30:41.945423] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 ***
00:08:03.374 INFO: Running with entropic power schedule (0xFF, 100).
00:08:03.374 INFO: Seed: 153712557
00:08:03.374 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:08:03.374 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:08:03.374 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:03.374 INFO: A corpus is not provided, starting from an empty corpus
00:08:03.374 #2 INITED exec/s: 0 rss: 60Mb
00:08:03.374 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
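Run 23 also starts from an empty corpus, so the instrumentation warning that follows is expected until the first inputs are found. The #N NEW and DONE status lines throughout these runs are standard libFuzzer output rather than anything SPDK-specific; as a reading aid, one line from run 22 is annotated below. The field meanings are stated from general libFuzzer behavior, not from this log, so treat the breakdown as a hedged gloss.

  # #54 NEW cov: 11823 ft: 15318 corp: 46/2109b lim: 85 exec/s: 27 rss: 70Mb L: 40/82 MS: 1 ShuffleBytes-
  # #54      event counter; NEW means this input was added to the corpus, DONE closes the run
  # cov:     coverage points (edges/counters) observed so far
  # ft:      features, a finer-grained coverage and value-profile signal
  # corp:    corpus size as units/total bytes
  # lim:     current input-length cap, ramping up toward -max_len
  # exec/s:  target executions per second
  # rss:     resident memory of the fuzzing process
  # L: a/b   size of this unit / largest unit in the corpus
  # MS:      count and names of the mutations that produced the input; PersAutoDict
  #          and CMP mutations carry dictionary entries (DE: "...") that also feed
  #          the "Recommended dictionary" block printed at the end of each run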
00:08:03.374 This may also happen if the target rejected all inputs we tried so far 00:08:03.374 [2024-07-12 21:30:42.014546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:03.374 [2024-07-12 21:30:42.014582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.374 [2024-07-12 21:30:42.014700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:03.374 [2024-07-12 21:30:42.014723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.633 NEW_FUNC[1/671]: 0x4ab520 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:03.633 NEW_FUNC[2/671]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.633 #8 NEW cov: 11523 ft: 11524 corp: 2/15b lim: 25 exec/s: 0 rss: 67Mb L: 14/14 MS: 1 InsertRepeatedBytes- 00:08:03.633 [2024-07-12 21:30:42.324801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:03.633 [2024-07-12 21:30:42.324841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.633 [2024-07-12 21:30:42.324899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:03.633 [2024-07-12 21:30:42.324914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.633 [2024-07-12 21:30:42.324986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:03.633 [2024-07-12 21:30:42.325002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.633 [2024-07-12 21:30:42.325057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:03.633 [2024-07-12 21:30:42.325072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.633 #9 NEW cov: 11642 ft: 12635 corp: 3/35b lim: 25 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 CopyPart- 00:08:03.633 [2024-07-12 21:30:42.374876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:03.633 [2024-07-12 21:30:42.374905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.633 [2024-07-12 21:30:42.374948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:03.633 [2024-07-12 21:30:42.374963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.634 [2024-07-12 21:30:42.375018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:03.634 [2024-07-12 21:30:42.375033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.634 [2024-07-12 
21:30:42.375085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:03.634 [2024-07-12 21:30:42.375100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.634 #11 NEW cov: 11648 ft: 12923 corp: 4/58b lim: 25 exec/s: 0 rss: 67Mb L: 23/23 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:03.634 [2024-07-12 21:30:42.414631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:03.634 [2024-07-12 21:30:42.414657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.893 #14 NEW cov: 11733 ft: 13639 corp: 5/63b lim: 25 exec/s: 0 rss: 67Mb L: 5/23 MS: 3 CrossOver-CopyPart-InsertByte- 00:08:03.893 [2024-07-12 21:30:42.455124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:03.893 [2024-07-12 21:30:42.455150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.893 [2024-07-12 21:30:42.455196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:03.893 [2024-07-12 21:30:42.455211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.893 [2024-07-12 21:30:42.455264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:03.893 [2024-07-12 21:30:42.455279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.893 [2024-07-12 21:30:42.455333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:03.893 [2024-07-12 21:30:42.455348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.893 #20 NEW cov: 11733 ft: 13696 corp: 6/83b lim: 25 exec/s: 0 rss: 67Mb L: 20/23 MS: 1 ShuffleBytes- 00:08:03.893 [2024-07-12 21:30:42.495212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:03.893 [2024-07-12 21:30:42.495238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.893 [2024-07-12 21:30:42.495302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:03.893 [2024-07-12 21:30:42.495318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.495369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:03.894 [2024-07-12 21:30:42.495385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.495438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:03.894 [2024-07-12 21:30:42.495460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:03.894 #21 NEW cov: 11733 ft: 13750 corp: 7/107b lim: 25 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 CopyPart- 00:08:03.894 [2024-07-12 21:30:42.535333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:03.894 [2024-07-12 21:30:42.535359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.535406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:03.894 [2024-07-12 21:30:42.535422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.535479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:03.894 [2024-07-12 21:30:42.535493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.535548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:03.894 [2024-07-12 21:30:42.535562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.894 #22 NEW cov: 11733 ft: 13868 corp: 8/131b lim: 25 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 ChangeBit- 00:08:03.894 [2024-07-12 21:30:42.575390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:03.894 [2024-07-12 21:30:42.575416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.575457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:03.894 [2024-07-12 21:30:42.575473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.575544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:03.894 [2024-07-12 21:30:42.575560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.894 #23 NEW cov: 11733 ft: 14091 corp: 9/146b lim: 25 exec/s: 0 rss: 68Mb L: 15/24 MS: 1 EraseBytes- 00:08:03.894 [2024-07-12 21:30:42.615545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:03.894 [2024-07-12 21:30:42.615571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.615618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:03.894 [2024-07-12 21:30:42.615633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.615685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:03.894 [2024-07-12 21:30:42.615700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 
21:30:42.615756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:03.894 [2024-07-12 21:30:42.615771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.894 #24 NEW cov: 11733 ft: 14161 corp: 10/168b lim: 25 exec/s: 0 rss: 68Mb L: 22/24 MS: 1 InsertRepeatedBytes- 00:08:03.894 [2024-07-12 21:30:42.655682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:03.894 [2024-07-12 21:30:42.655708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.655756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:03.894 [2024-07-12 21:30:42.655772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.655825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:03.894 [2024-07-12 21:30:42.655840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:03.894 [2024-07-12 21:30:42.655893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:03.894 [2024-07-12 21:30:42.655908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:03.894 #25 NEW cov: 11733 ft: 14209 corp: 11/191b lim: 25 exec/s: 0 rss: 68Mb L: 23/24 MS: 1 ShuffleBytes- 00:08:04.153 [2024-07-12 21:30:42.695796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.153 [2024-07-12 21:30:42.695823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.153 [2024-07-12 21:30:42.695887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.154 [2024-07-12 21:30:42.695902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.154 [2024-07-12 21:30:42.695958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.154 [2024-07-12 21:30:42.695972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.154 [2024-07-12 21:30:42.696027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.154 [2024-07-12 21:30:42.696041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.154 #26 NEW cov: 11733 ft: 14294 corp: 12/215b lim: 25 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 ShuffleBytes- 00:08:04.154 [2024-07-12 21:30:42.735616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.154 [2024-07-12 21:30:42.735642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.154 #32 NEW cov: 11733 
ft: 14344 corp: 13/221b lim: 25 exec/s: 0 rss: 68Mb L: 6/24 MS: 1 CrossOver- 00:08:04.154 [2024-07-12 21:30:42.776162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.154 [2024-07-12 21:30:42.776188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.154 [2024-07-12 21:30:42.776241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.154 [2024-07-12 21:30:42.776261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.154 [2024-07-12 21:30:42.776315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.154 [2024-07-12 21:30:42.776331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.154 [2024-07-12 21:30:42.776384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.154 [2024-07-12 21:30:42.776399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.154 [2024-07-12 21:30:42.776456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:04.154 [2024-07-12 21:30:42.776471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:04.154 #33 NEW cov: 11733 ft: 14415 corp: 14/246b lim: 25 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 InsertByte- 00:08:04.154 [2024-07-12 21:30:42.815798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.154 [2024-07-12 21:30:42.815824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.154 #34 NEW cov: 11733 ft: 14441 corp: 15/251b lim: 25 exec/s: 0 rss: 68Mb L: 5/25 MS: 1 CrossOver- 00:08:04.154 [2024-07-12 21:30:42.856221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.154 [2024-07-12 21:30:42.856248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.154 [2024-07-12 21:30:42.856297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.154 [2024-07-12 21:30:42.856312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.154 [2024-07-12 21:30:42.856383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.154 [2024-07-12 21:30:42.856398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.154 [2024-07-12 21:30:42.856457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.154 [2024-07-12 21:30:42.856471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.154 NEW_FUNC[1/1]: 0x195e300 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:04.154 #35 NEW cov: 11756 ft: 14470 corp: 16/273b lim: 25 exec/s: 0 rss: 68Mb L: 22/25 MS: 1 CopyPart- 00:08:04.154 [2024-07-12 21:30:42.896178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.154 [2024-07-12 21:30:42.896205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.154 [2024-07-12 21:30:42.896258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.154 [2024-07-12 21:30:42.896273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.154 #36 NEW cov: 11756 ft: 14529 corp: 17/287b lim: 25 exec/s: 0 rss: 68Mb L: 14/25 MS: 1 ChangeByte- 00:08:04.413 [2024-07-12 21:30:42.936239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.413 [2024-07-12 21:30:42.936265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.413 #37 NEW cov: 11756 ft: 14559 corp: 18/292b lim: 25 exec/s: 0 rss: 69Mb L: 5/25 MS: 1 ChangeByte- 00:08:04.413 [2024-07-12 21:30:42.976330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.413 [2024-07-12 21:30:42.976356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.413 #38 NEW cov: 11756 ft: 14576 corp: 19/297b lim: 25 exec/s: 38 rss: 69Mb L: 5/25 MS: 1 ShuffleBytes- 00:08:04.413 [2024-07-12 21:30:43.016815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.413 [2024-07-12 21:30:43.016841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.413 [2024-07-12 21:30:43.016885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.413 [2024-07-12 21:30:43.016900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.413 [2024-07-12 21:30:43.016954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.413 [2024-07-12 21:30:43.016969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.413 [2024-07-12 21:30:43.017024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.413 [2024-07-12 21:30:43.017038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.413 #44 NEW cov: 11756 ft: 14599 corp: 20/318b lim: 25 exec/s: 44 rss: 69Mb L: 21/25 MS: 1 InsertByte- 00:08:04.413 [2024-07-12 21:30:43.056527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.413 [2024-07-12 21:30:43.056552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.413 #45 NEW cov: 11756 ft: 14694 
corp: 21/323b lim: 25 exec/s: 45 rss: 69Mb L: 5/25 MS: 1 ChangeBit- 00:08:04.413 [2024-07-12 21:30:43.096695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.413 [2024-07-12 21:30:43.096721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.413 #46 NEW cov: 11756 ft: 14725 corp: 22/330b lim: 25 exec/s: 46 rss: 69Mb L: 7/25 MS: 1 InsertByte- 00:08:04.413 [2024-07-12 21:30:43.137032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.413 [2024-07-12 21:30:43.137058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.413 [2024-07-12 21:30:43.137101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.413 [2024-07-12 21:30:43.137116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.413 [2024-07-12 21:30:43.137187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.413 [2024-07-12 21:30:43.137203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.413 #49 NEW cov: 11756 ft: 14755 corp: 23/349b lim: 25 exec/s: 49 rss: 69Mb L: 19/25 MS: 3 EraseBytes-InsertByte-InsertRepeatedBytes- 00:08:04.413 [2024-07-12 21:30:43.177261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.413 [2024-07-12 21:30:43.177287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.413 [2024-07-12 21:30:43.177335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.413 [2024-07-12 21:30:43.177351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.413 [2024-07-12 21:30:43.177405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.413 [2024-07-12 21:30:43.177422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.413 [2024-07-12 21:30:43.177481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.413 [2024-07-12 21:30:43.177497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.672 #50 NEW cov: 11756 ft: 14766 corp: 24/369b lim: 25 exec/s: 50 rss: 69Mb L: 20/25 MS: 1 ChangeByte- 00:08:04.672 [2024-07-12 21:30:43.217347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.672 [2024-07-12 21:30:43.217373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.672 [2024-07-12 21:30:43.217421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.672 [2024-07-12 21:30:43.217437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.672 [2024-07-12 21:30:43.217495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.672 [2024-07-12 21:30:43.217509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.672 [2024-07-12 21:30:43.217564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.672 [2024-07-12 21:30:43.217579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.672 #51 NEW cov: 11756 ft: 14774 corp: 25/391b lim: 25 exec/s: 51 rss: 69Mb L: 22/25 MS: 1 ShuffleBytes- 00:08:04.672 [2024-07-12 21:30:43.257507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.672 [2024-07-12 21:30:43.257536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.672 [2024-07-12 21:30:43.257580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.672 [2024-07-12 21:30:43.257595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.672 [2024-07-12 21:30:43.257650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.672 [2024-07-12 21:30:43.257665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.672 [2024-07-12 21:30:43.257719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.672 [2024-07-12 21:30:43.257732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.672 #52 NEW cov: 11756 ft: 14789 corp: 26/411b lim: 25 exec/s: 52 rss: 69Mb L: 20/25 MS: 1 ShuffleBytes- 00:08:04.672 [2024-07-12 21:30:43.297267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.672 [2024-07-12 21:30:43.297295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.672 #53 NEW cov: 11756 ft: 14823 corp: 27/418b lim: 25 exec/s: 53 rss: 69Mb L: 7/25 MS: 1 ShuffleBytes- 00:08:04.672 [2024-07-12 21:30:43.337355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.672 [2024-07-12 21:30:43.337381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.672 #54 NEW cov: 11756 ft: 14857 corp: 28/425b lim: 25 exec/s: 54 rss: 69Mb L: 7/25 MS: 1 ChangeBit- 00:08:04.672 [2024-07-12 21:30:43.377612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.672 [2024-07-12 21:30:43.377642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.672 [2024-07-12 21:30:43.377699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 
00:08:04.672 [2024-07-12 21:30:43.377715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.672 #55 NEW cov: 11756 ft: 14858 corp: 29/435b lim: 25 exec/s: 55 rss: 70Mb L: 10/25 MS: 1 CrossOver- 00:08:04.672 [2024-07-12 21:30:43.417953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.672 [2024-07-12 21:30:43.417980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.672 [2024-07-12 21:30:43.418021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.672 [2024-07-12 21:30:43.418036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.672 [2024-07-12 21:30:43.418092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.672 [2024-07-12 21:30:43.418107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.672 [2024-07-12 21:30:43.418161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.672 [2024-07-12 21:30:43.418177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.931 [2024-07-12 21:30:43.458087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.931 [2024-07-12 21:30:43.458114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.931 [2024-07-12 21:30:43.458159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.931 [2024-07-12 21:30:43.458173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.931 [2024-07-12 21:30:43.458229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.931 [2024-07-12 21:30:43.458244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.931 [2024-07-12 21:30:43.458301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.931 [2024-07-12 21:30:43.458317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.931 #57 NEW cov: 11756 ft: 14871 corp: 30/458b lim: 25 exec/s: 57 rss: 70Mb L: 23/25 MS: 2 ShuffleBytes-ChangeBit- 00:08:04.931 [2024-07-12 21:30:43.498170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.931 [2024-07-12 21:30:43.498196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.931 [2024-07-12 21:30:43.498244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.932 [2024-07-12 21:30:43.498260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.498316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.932 [2024-07-12 21:30:43.498331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.498385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.932 [2024-07-12 21:30:43.498403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.932 #58 NEW cov: 11756 ft: 14876 corp: 31/478b lim: 25 exec/s: 58 rss: 70Mb L: 20/25 MS: 1 InsertRepeatedBytes- 00:08:04.932 [2024-07-12 21:30:43.538355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.932 [2024-07-12 21:30:43.538383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.538424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.932 [2024-07-12 21:30:43.538446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.538504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.932 [2024-07-12 21:30:43.538520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.538578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.932 [2024-07-12 21:30:43.538593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.932 #59 NEW cov: 11756 ft: 14956 corp: 32/502b lim: 25 exec/s: 59 rss: 70Mb L: 24/25 MS: 1 ShuffleBytes- 00:08:04.932 [2024-07-12 21:30:43.578400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.932 [2024-07-12 21:30:43.578428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.578482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.932 [2024-07-12 21:30:43.578498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.578553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.932 [2024-07-12 21:30:43.578568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.578623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.932 [2024-07-12 21:30:43.578637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.932 #60 NEW cov: 11756 ft: 14994 corp: 33/526b lim: 
25 exec/s: 60 rss: 70Mb L: 24/25 MS: 1 ChangeBit- 00:08:04.932 [2024-07-12 21:30:43.618668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.932 [2024-07-12 21:30:43.618695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.618744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.932 [2024-07-12 21:30:43.618760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.618813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.932 [2024-07-12 21:30:43.618828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.618880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.932 [2024-07-12 21:30:43.618895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.618949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:04.932 [2024-07-12 21:30:43.618967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:04.932 #61 NEW cov: 11756 ft: 15000 corp: 34/551b lim: 25 exec/s: 61 rss: 70Mb L: 25/25 MS: 1 InsertByte- 00:08:04.932 [2024-07-12 21:30:43.658813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.932 [2024-07-12 21:30:43.658840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.658891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:04.932 [2024-07-12 21:30:43.658906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.658962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:04.932 [2024-07-12 21:30:43.658978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.659030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:04.932 [2024-07-12 21:30:43.659044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:04.932 [2024-07-12 21:30:43.659097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:04.932 [2024-07-12 21:30:43.659112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:04.932 #62 NEW cov: 11756 ft: 15012 corp: 35/576b lim: 25 exec/s: 62 rss: 70Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:04.932 [2024-07-12 21:30:43.698461] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:04.932 [2024-07-12 21:30:43.698488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.191 #63 NEW cov: 11756 ft: 15049 corp: 36/584b lim: 25 exec/s: 63 rss: 70Mb L: 8/25 MS: 1 CopyPart- 00:08:05.191 [2024-07-12 21:30:43.738909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:05.191 [2024-07-12 21:30:43.738936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.738983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:05.191 [2024-07-12 21:30:43.738999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.739052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:05.191 [2024-07-12 21:30:43.739067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.739118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:05.191 [2024-07-12 21:30:43.739134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.191 #64 NEW cov: 11756 ft: 15056 corp: 37/605b lim: 25 exec/s: 64 rss: 70Mb L: 21/25 MS: 1 CMP- DE: "\016\000\000\000\000\000\000\000"- 00:08:05.191 [2024-07-12 21:30:43.779001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:05.191 [2024-07-12 21:30:43.779028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.779078] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:05.191 [2024-07-12 21:30:43.779092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.779149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:05.191 [2024-07-12 21:30:43.779164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.779217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:05.191 [2024-07-12 21:30:43.779232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.191 #65 NEW cov: 11756 ft: 15061 corp: 38/626b lim: 25 exec/s: 65 rss: 70Mb L: 21/25 MS: 1 ShuffleBytes- 00:08:05.191 [2024-07-12 21:30:43.819245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:05.191 [2024-07-12 21:30:43.819272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.819322] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:05.191 [2024-07-12 21:30:43.819338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.819392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:05.191 [2024-07-12 21:30:43.819407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.819460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:05.191 [2024-07-12 21:30:43.819476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.819530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:05.191 [2024-07-12 21:30:43.819545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:05.191 #66 NEW cov: 11756 ft: 15068 corp: 39/651b lim: 25 exec/s: 66 rss: 70Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:05.191 [2024-07-12 21:30:43.859290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:05.191 [2024-07-12 21:30:43.859316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.859365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:05.191 [2024-07-12 21:30:43.859381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.859432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:05.191 [2024-07-12 21:30:43.859452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.859505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:05.191 [2024-07-12 21:30:43.859520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.191 #67 NEW cov: 11756 ft: 15079 corp: 40/674b lim: 25 exec/s: 67 rss: 70Mb L: 23/25 MS: 1 ChangeBinInt- 00:08:05.191 [2024-07-12 21:30:43.899395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:05.191 [2024-07-12 21:30:43.899421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.899475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:05.191 [2024-07-12 21:30:43.899491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.191 [2024-07-12 21:30:43.899545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:05.191 
[2024-07-12 21:30:43.899560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:05.191 [2024-07-12 21:30:43.899615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:05.191 [2024-07-12 21:30:43.899631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:05.191 #68 NEW cov: 11756 ft: 15113 corp: 41/697b lim: 25 exec/s: 68 rss: 70Mb L: 23/25 MS: 1 ChangeByte-
00:08:05.191 [2024-07-12 21:30:43.939141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:05.191 [2024-07-12 21:30:43.939167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:05.191 #69 NEW cov: 11756 ft: 15132 corp: 42/704b lim: 25 exec/s: 69 rss: 70Mb L: 7/25 MS: 1 InsertByte-
00:08:05.451 [2024-07-12 21:30:43.979618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0
00:08:05.451 [2024-07-12 21:30:43.979645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:05.451 [2024-07-12 21:30:43.979691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0
00:08:05.451 [2024-07-12 21:30:43.979707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:05.451 [2024-07-12 21:30:43.979774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0
00:08:05.451 [2024-07-12 21:30:43.979789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:05.451 [2024-07-12 21:30:43.979844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0
00:08:05.451 [2024-07-12 21:30:43.979859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:05.451 #70 NEW cov: 11756 ft: 15146 corp: 43/726b lim: 25 exec/s: 35 rss: 70Mb L: 22/25 MS: 1 InsertByte-
00:08:05.451 #70 DONE cov: 11756 ft: 15146 corp: 43/726b lim: 25 exec/s: 35 rss: 70Mb
00:08:05.451 ###### Recommended dictionary. ######
00:08:05.451 "\016\000\000\000\000\000\000\000" # Uses: 0
00:08:05.451 ###### End of recommended dictionary. ######
00:08:05.451 Done 70 runs in 2 second(s)
00:08:05.451 21:30:44 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf
00:08:05.451 21:30:44 -- ../common.sh@72 -- # (( i++ ))
00:08:05.451 21:30:44 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:05.451 21:30:44 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1
00:08:05.451 21:30:44 -- nvmf/run.sh@23 -- # local fuzzer_type=24
00:08:05.451 21:30:44 -- nvmf/run.sh@24 -- # local timen=1
00:08:05.451 21:30:44 -- nvmf/run.sh@25 -- # local core=0x1
00:08:05.451 21:30:44 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:05.451 21:30:44 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf
00:08:05.451 21:30:44 -- nvmf/run.sh@29 -- # printf %02d 24
00:08:05.451 21:30:44 -- nvmf/run.sh@29 -- # port=4424
00:08:05.451 21:30:44 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:05.451 21:30:44 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424'
00:08:05.451 21:30:44 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:05.451 21:30:44 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock
00:08:05.451 [2024-07-12 21:30:44.172572] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:08:05.451 [2024-07-12 21:30:44.172656] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3588499 ]
00:08:05.461 EAL: No free 2048 kB hugepages reported on node 1
00:08:05.598 [2024-07-12 21:30:44.348389] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:05.635 [2024-07-12 21:30:44.411580] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:05.635 [2024-07-12 21:30:44.411720] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:05.642 [2024-07-12 21:30:44.469585] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:05.656 [2024-07-12 21:30:44.485880] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 ***
00:08:05.970 INFO: Running with entropic power schedule (0xFF, 100).
00:08:05.970 INFO: Seed: 2696709340
00:08:05.970 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x280a94c, 0x285dea9),
00:08:05.970 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x285deb0,0x2d93480),
00:08:05.970 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:08:05.970 INFO: A corpus is not provided, starting from an empty corpus
00:08:05.970 #2 INITED exec/s: 0 rss: 60Mb
00:08:05.970 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:05.970 This may also happen if the target rejected all inputs we tried so far 00:08:05.970 [2024-07-12 21:30:44.552603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.970 [2024-07-12 21:30:44.552637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.970 [2024-07-12 21:30:44.552782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.970 [2024-07-12 21:30:44.552805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.970 [2024-07-12 21:30:44.552944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.970 [2024-07-12 21:30:44.552972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.970 [2024-07-12 21:30:44.553112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.970 [2024-07-12 21:30:44.553139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.229 NEW_FUNC[1/672]: 0x4ac600 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:06.229 NEW_FUNC[2/672]: 0x4bd260 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.229 #7 NEW cov: 11601 ft: 11590 corp: 2/88b lim: 100 exec/s: 0 rss: 66Mb L: 87/87 MS: 5 InsertByte-CrossOver-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:06.229 [2024-07-12 21:30:44.892731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.229 [2024-07-12 21:30:44.892773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.229 [2024-07-12 21:30:44.892907] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.229 [2024-07-12 21:30:44.892928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.229 #8 NEW cov: 11714 ft: 12549 corp: 3/145b lim: 100 exec/s: 0 rss: 67Mb L: 57/87 MS: 1 EraseBytes- 00:08:06.229 [2024-07-12 21:30:44.943386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.229 [2024-07-12 21:30:44.943421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.229 [2024-07-12 21:30:44.943533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.229 [2024-07-12 21:30:44.943560] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.229 [2024-07-12 21:30:44.943688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.229 [2024-07-12 21:30:44.943715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.229 [2024-07-12 21:30:44.943841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.229 [2024-07-12 21:30:44.943864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.229 #9 NEW cov: 11720 ft: 12874 corp: 4/232b lim: 100 exec/s: 0 rss: 67Mb L: 87/87 MS: 1 ShuffleBytes- 00:08:06.229 [2024-07-12 21:30:44.983541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.229 [2024-07-12 21:30:44.983572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.229 [2024-07-12 21:30:44.983657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.229 [2024-07-12 21:30:44.983677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.229 [2024-07-12 21:30:44.983790] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.229 [2024-07-12 21:30:44.983815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.229 [2024-07-12 21:30:44.983935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.229 [2024-07-12 21:30:44.983960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.229 #10 NEW cov: 11805 ft: 13117 corp: 5/319b lim: 100 exec/s: 0 rss: 67Mb L: 87/87 MS: 1 ShuffleBytes- 00:08:06.489 [2024-07-12 21:30:45.032930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.489 [2024-07-12 21:30:45.032956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.489 #13 NEW cov: 11805 ft: 14027 corp: 6/342b lim: 100 exec/s: 0 rss: 67Mb L: 23/87 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:06.489 [2024-07-12 21:30:45.073037] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.489 [2024-07-12 21:30:45.073071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.489 #14 NEW cov: 11805 ft: 14118 corp: 7/365b lim: 100 exec/s: 0 rss: 67Mb L: 
23/87 MS: 1 ShuffleBytes- 00:08:06.489 [2024-07-12 21:30:45.123894] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.489 [2024-07-12 21:30:45.123923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.489 [2024-07-12 21:30:45.124012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.489 [2024-07-12 21:30:45.124038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.489 [2024-07-12 21:30:45.124155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.489 [2024-07-12 21:30:45.124179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.490 [2024-07-12 21:30:45.124304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446517574314229759 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.490 [2024-07-12 21:30:45.124329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.490 #20 NEW cov: 11805 ft: 14189 corp: 8/452b lim: 100 exec/s: 0 rss: 67Mb L: 87/87 MS: 1 ChangeByte- 00:08:06.490 [2024-07-12 21:30:45.174084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.490 [2024-07-12 21:30:45.174116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.490 [2024-07-12 21:30:45.174234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4294967040 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.490 [2024-07-12 21:30:45.174255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.490 [2024-07-12 21:30:45.174370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.490 [2024-07-12 21:30:45.174392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.490 [2024-07-12 21:30:45.174513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.490 [2024-07-12 21:30:45.174532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.490 #21 NEW cov: 11805 ft: 14236 corp: 9/547b lim: 100 exec/s: 0 rss: 68Mb L: 95/95 MS: 1 InsertRepeatedBytes- 00:08:06.490 [2024-07-12 21:30:45.213350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.490 [2024-07-12 21:30:45.213380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.490 #22 NEW cov: 11805 ft: 14277 corp: 10/570b lim: 100 exec/s: 0 rss: 68Mb L: 23/95 MS: 1 ChangeBinInt- 00:08:06.490 [2024-07-12 21:30:45.264284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.490 [2024-07-12 21:30:45.264315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.490 [2024-07-12 21:30:45.264406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.490 [2024-07-12 21:30:45.264429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.490 [2024-07-12 21:30:45.264549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.490 [2024-07-12 21:30:45.264574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.490 [2024-07-12 21:30:45.264691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.490 [2024-07-12 21:30:45.264714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.749 #23 NEW cov: 11805 ft: 14316 corp: 11/657b lim: 100 exec/s: 0 rss: 68Mb L: 87/95 MS: 1 CrossOver- 00:08:06.749 [2024-07-12 21:30:45.304385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.304416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.749 [2024-07-12 21:30:45.304510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.304537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.749 [2024-07-12 21:30:45.304652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.304673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.749 [2024-07-12 21:30:45.304784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.304806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.749 #24 NEW cov: 11805 ft: 14406 corp: 12/753b lim: 100 exec/s: 0 rss: 68Mb L: 96/96 MS: 1 CrossOver- 00:08:06.749 [2024-07-12 21:30:45.343782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.343805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.749 #25 NEW cov: 11805 ft: 14427 corp: 13/776b lim: 100 exec/s: 0 rss: 69Mb L: 23/96 MS: 1 ShuffleBytes- 00:08:06.749 [2024-07-12 21:30:45.384695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.384728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.749 [2024-07-12 21:30:45.384832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.384849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.749 [2024-07-12 21:30:45.384958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.384983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.749 [2024-07-12 21:30:45.385103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709489151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.385128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.749 #26 NEW cov: 11805 ft: 14450 corp: 14/872b lim: 100 exec/s: 0 rss: 69Mb L: 96/96 MS: 1 ChangeBinInt- 00:08:06.749 [2024-07-12 21:30:45.424737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.424768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.749 [2024-07-12 21:30:45.424877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.424902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.749 [2024-07-12 21:30:45.425023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.425046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.749 [2024-07-12 21:30:45.425161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18378064179392151551 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.425181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.749 NEW_FUNC[1/1]: 0x195e300 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.749 #27 NEW cov: 11828 ft: 14565 corp: 15/971b lim: 100 exec/s: 0 rss: 69Mb L: 99/99 MS: 1 CopyPart- 00:08:06.749 [2024-07-12 21:30:45.474192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.474216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.749 #28 NEW cov: 11828 ft: 14596 corp: 16/994b lim: 100 exec/s: 0 rss: 69Mb L: 23/99 MS: 1 CrossOver- 00:08:06.749 [2024-07-12 21:30:45.514257] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.749 [2024-07-12 21:30:45.514281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.009 #29 NEW cov: 11828 ft: 14606 corp: 17/1017b lim: 100 exec/s: 29 rss: 69Mb L: 23/99 MS: 1 CopyPart- 00:08:07.009 [2024-07-12 21:30:45.554593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.009 [2024-07-12 21:30:45.554627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.009 [2024-07-12 21:30:45.554764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.009 [2024-07-12 21:30:45.554790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.009 #30 NEW cov: 11828 ft: 14684 corp: 18/1074b lim: 100 exec/s: 30 rss: 69Mb L: 57/99 MS: 1 CrossOver- 00:08:07.009 [2024-07-12 21:30:45.594529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2031616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.009 [2024-07-12 21:30:45.594561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.009 #31 NEW cov: 11828 ft: 14697 corp: 19/1098b lim: 100 exec/s: 31 rss: 70Mb L: 24/99 MS: 1 InsertByte- 00:08:07.009 [2024-07-12 21:30:45.644926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65024 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.009 [2024-07-12 21:30:45.644957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.009 [2024-07-12 21:30:45.645074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.009 [2024-07-12 21:30:45.645097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.009 #32 NEW cov: 11828 ft: 14722 corp: 20/1155b lim: 100 exec/s: 32 rss: 70Mb L: 57/99 MS: 1 ChangeBit- 00:08:07.009 [2024-07-12 21:30:45.684819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.009 [2024-07-12 21:30:45.684845] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.009 #33 NEW cov: 11828 ft: 14770 corp: 21/1177b lim: 100 exec/s: 33 rss: 70Mb L: 22/99 MS: 1 EraseBytes- 00:08:07.009 [2024-07-12 21:30:45.724872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.009 [2024-07-12 21:30:45.724896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.009 #38 NEW cov: 11828 ft: 14779 corp: 22/1214b lim: 100 exec/s: 38 rss: 70Mb L: 37/99 MS: 5 ChangeByte-ChangeBit-CopyPart-ChangeByte-InsertRepeatedBytes- 00:08:07.009 [2024-07-12 21:30:45.765751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.009 [2024-07-12 21:30:45.765777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.009 [2024-07-12 21:30:45.765851] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.009 [2024-07-12 21:30:45.765875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.009 [2024-07-12 21:30:45.765994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.009 [2024-07-12 21:30:45.766019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.009 [2024-07-12 21:30:45.766142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446517574314229759 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.009 [2024-07-12 21:30:45.766165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.009 #39 NEW cov: 11828 ft: 14787 corp: 23/1301b lim: 100 exec/s: 39 rss: 70Mb L: 87/99 MS: 1 ChangeBinInt- 00:08:07.268 [2024-07-12 21:30:45.806204] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.268 [2024-07-12 21:30:45.806236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.268 [2024-07-12 21:30:45.806295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.268 [2024-07-12 21:30:45.806318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.268 [2024-07-12 21:30:45.806436] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.268 [2024-07-12 21:30:45.806467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.268 [2024-07-12 
21:30:45.806574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18378064179392151551 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.269 [2024-07-12 21:30:45.806594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.269 [2024-07-12 21:30:45.806710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.269 [2024-07-12 21:30:45.806728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:07.269 #40 NEW cov: 11828 ft: 14868 corp: 24/1401b lim: 100 exec/s: 40 rss: 70Mb L: 100/100 MS: 1 CopyPart- 00:08:07.269 [2024-07-12 21:30:45.845343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.269 [2024-07-12 21:30:45.845367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.269 #41 NEW cov: 11828 ft: 14878 corp: 25/1432b lim: 100 exec/s: 41 rss: 70Mb L: 31/100 MS: 1 EraseBytes- 00:08:07.269 [2024-07-12 21:30:45.885593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65024 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.269 [2024-07-12 21:30:45.885620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.269 [2024-07-12 21:30:45.885728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.269 [2024-07-12 21:30:45.885754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.269 #42 NEW cov: 11828 ft: 14893 corp: 26/1489b lim: 100 exec/s: 42 rss: 70Mb L: 57/100 MS: 1 CrossOver- 00:08:07.269 [2024-07-12 21:30:45.925497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.269 [2024-07-12 21:30:45.925525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.269 #43 NEW cov: 11828 ft: 14909 corp: 27/1526b lim: 100 exec/s: 43 rss: 70Mb L: 37/100 MS: 1 CopyPart- 00:08:07.269 [2024-07-12 21:30:45.965630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:65280 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.269 [2024-07-12 21:30:45.965662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.269 #49 NEW cov: 11828 ft: 14913 corp: 28/1550b lim: 100 exec/s: 49 rss: 70Mb L: 24/100 MS: 1 CrossOver- 00:08:07.269 [2024-07-12 21:30:46.005745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.269 [2024-07-12 21:30:46.005777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.269 #50 NEW 
cov: 11828 ft: 14919 corp: 29/1587b lim: 100 exec/s: 50 rss: 70Mb L: 37/100 MS: 1 CopyPart- 00:08:07.269 [2024-07-12 21:30:46.045855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.269 [2024-07-12 21:30:46.045886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.528 #51 NEW cov: 11828 ft: 14937 corp: 30/1610b lim: 100 exec/s: 51 rss: 70Mb L: 23/100 MS: 1 CrossOver- 00:08:07.528 [2024-07-12 21:30:46.086729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.086758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.528 [2024-07-12 21:30:46.086850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.086874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.528 [2024-07-12 21:30:46.086989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.087009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.528 [2024-07-12 21:30:46.087121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446517574314229759 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.087143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.528 #52 NEW cov: 11828 ft: 14942 corp: 31/1697b lim: 100 exec/s: 52 rss: 70Mb L: 87/100 MS: 1 ChangeBit- 00:08:07.528 [2024-07-12 21:30:46.126157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.126185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.528 #53 NEW cov: 11828 ft: 14947 corp: 32/1725b lim: 100 exec/s: 53 rss: 70Mb L: 28/100 MS: 1 CopyPart- 00:08:07.528 [2024-07-12 21:30:46.167065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.167094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.528 [2024-07-12 21:30:46.167200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.167220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.528 [2024-07-12 21:30:46.167336] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 
21:30:46.167357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.528 [2024-07-12 21:30:46.167496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.167517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.528 #54 NEW cov: 11828 ft: 14985 corp: 33/1812b lim: 100 exec/s: 54 rss: 70Mb L: 87/100 MS: 1 InsertRepeatedBytes- 00:08:07.528 [2024-07-12 21:30:46.206686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:281470681743360 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.206714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.528 [2024-07-12 21:30:46.206786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.206809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.528 #59 NEW cov: 11828 ft: 14991 corp: 34/1856b lim: 100 exec/s: 59 rss: 70Mb L: 44/100 MS: 5 EraseBytes-CMP-ShuffleBytes-ChangeBinInt-CrossOver- DE: "\002\000\000\000\000\000\000\000"- 00:08:07.528 [2024-07-12 21:30:46.246677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.246713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.528 [2024-07-12 21:30:46.246837] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.246860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.528 #60 NEW cov: 11828 ft: 15016 corp: 35/1915b lim: 100 exec/s: 60 rss: 70Mb L: 59/100 MS: 1 EraseBytes- 00:08:07.528 [2024-07-12 21:30:46.287304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.528 [2024-07-12 21:30:46.287334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.528 [2024-07-12 21:30:46.287451] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4294967040 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.529 [2024-07-12 21:30:46.287471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.529 [2024-07-12 21:30:46.287601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.529 [2024-07-12 21:30:46.287624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
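The "#NN NEW cov: ... MS: ..." records interleaved with the qpair notices are libFuzzer's standard progress lines: cov counts covered code edges, ft counts features, corp gives corpus size as units/bytes, lim is the current input-length cap (100 here), exec/s and rss report throughput and memory, L is the new unit's length versus the largest seen, MS lists the mutation sequence that produced the unit (ChangeBinInt, CrossOver, InsertRepeatedBytes, and so on), and DE names a dictionary entry pulled in by the CMP mutator. A hedged sketch for extracting the coverage curve from a saved console log; the field positions assume exactly the default libFuzzer status format shown here:

    # Print iteration number, edge coverage, feature count, and corpus size
    # for every NEW event in the log.
    grep -o '#[0-9]* NEW cov: [0-9]* ft: [0-9]* corp: [0-9]*/[0-9]*b' build.log |
        awk '{ gsub(/#/, "", $1); print $1, $4, $6, $8 }'

Run against the records above this prints rows such as "59 11828 14991 34/1856b", ready to plot as coverage over iterations.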
00:08:07.529 [2024-07-12 21:30:46.287739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.529 [2024-07-12 21:30:46.287761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.788 #61 NEW cov: 11828 ft: 15036 corp: 36/2010b lim: 100 exec/s: 61 rss: 70Mb L: 95/100 MS: 1 ShuffleBytes- 00:08:07.788 [2024-07-12 21:30:46.336788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:31931402515578880 len:29042 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.788 [2024-07-12 21:30:46.336813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.788 #62 NEW cov: 11828 ft: 15104 corp: 37/2044b lim: 100 exec/s: 62 rss: 70Mb L: 34/100 MS: 1 InsertRepeatedBytes- 00:08:07.788 [2024-07-12 21:30:46.386925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.788 [2024-07-12 21:30:46.386950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.788 #63 NEW cov: 11828 ft: 15116 corp: 38/2082b lim: 100 exec/s: 63 rss: 70Mb L: 38/100 MS: 1 InsertByte- 00:08:07.788 [2024-07-12 21:30:46.427782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.788 [2024-07-12 21:30:46.427815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.788 [2024-07-12 21:30:46.427911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.788 [2024-07-12 21:30:46.427934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.788 [2024-07-12 21:30:46.428047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.788 [2024-07-12 21:30:46.428069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.788 [2024-07-12 21:30:46.428191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446517574314229759 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.788 [2024-07-12 21:30:46.428217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.788 #69 NEW cov: 11828 ft: 15129 corp: 39/2169b lim: 100 exec/s: 69 rss: 70Mb L: 87/100 MS: 1 CopyPart- 00:08:07.788 [2024-07-12 21:30:46.467979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069592517375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.788 [2024-07-12 21:30:46.468016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.788 [2024-07-12 21:30:46.468137] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.788 [2024-07-12 21:30:46.468159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.788 [2024-07-12 21:30:46.468270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.788 [2024-07-12 21:30:46.468293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.788 [2024-07-12 21:30:46.468407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.788 [2024-07-12 21:30:46.468432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.788 #70 NEW cov: 11828 ft: 15166 corp: 40/2256b lim: 100 exec/s: 70 rss: 70Mb L: 87/100 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:07.788 [2024-07-12 21:30:46.517301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:61441 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.788 [2024-07-12 21:30:46.517328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.788 #71 NEW cov: 11828 ft: 15171 corp: 41/2280b lim: 100 exec/s: 35 rss: 70Mb L: 24/100 MS: 1 InsertByte- 00:08:07.788 #71 DONE cov: 11828 ft: 15171 corp: 41/2280b lim: 100 exec/s: 35 rss: 70Mb 00:08:07.788 ###### Recommended dictionary. ###### 00:08:07.788 "\002\000\000\000\000\000\000\000" # Uses: 1 00:08:07.788 ###### End of recommended dictionary. ###### 00:08:07.788 Done 71 runs in 2 second(s) 00:08:08.048 21:30:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:08.048 21:30:46 -- ../common.sh@72 -- # (( i++ )) 00:08:08.048 21:30:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.048 21:30:46 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:08.048 00:08:08.048 real 1m3.918s 00:08:08.048 user 1m40.385s 00:08:08.048 sys 0m6.969s 00:08:08.048 21:30:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:08.048 21:30:46 -- common/autotest_common.sh@10 -- # set +x 00:08:08.048 ************************************ 00:08:08.048 END TEST nvmf_fuzz 00:08:08.048 ************************************ 00:08:08.048 21:30:46 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:08.048 21:30:46 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:08.048 21:30:46 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:08.048 21:30:46 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:08.048 21:30:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:08.048 21:30:46 -- common/autotest_common.sh@10 -- # set +x 00:08:08.048 ************************************ 00:08:08.048 START TEST vfio_fuzz 00:08:08.048 ************************************ 00:08:08.048 21:30:46 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:08.048 * Looking for test storage... 
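The "Recommended dictionary" block in the run summary above reports tokens that earned new coverage, here an 8-byte little-endian 2 discovered through the CMP mutator ("Uses: 1" counts how often it helped). Feeding such tokens back shortens future runs. A sketch assuming a standalone libFuzzer binary invoked by hand; the binary name llvm_nvme_fuzz and the entry name cmp8 are illustrative, since the runs in this log are driven through nvmf/run.sh with its own generated command line:

    # Dictionary entries use the AFL/libFuzzer syntax name="bytes" with \xNN escapes.
    printf '%s\n' 'cmp8="\x02\x00\x00\x00\x00\x00\x00\x00"' > nvme.dict
    # Hypothetical direct run; -max_len matches the "lim: 100" seen above.
    ./llvm_nvme_fuzz -dict=nvme.dict -max_len=100 corpus/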
00:08:08.048 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:08.048 21:30:46 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:08.048 21:30:46 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:08.048 21:30:46 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:08.048 21:30:46 -- common/autotest_common.sh@34 -- # set -e 00:08:08.048 21:30:46 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:08.048 21:30:46 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:08.048 21:30:46 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:08.048 21:30:46 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:08.048 21:30:46 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:08.048 21:30:46 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:08.048 21:30:46 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:08.048 21:30:46 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:08.048 21:30:46 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:08.048 21:30:46 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:08.048 21:30:46 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:08.048 21:30:46 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:08.048 21:30:46 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:08.048 21:30:46 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:08.048 21:30:46 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:08.048 21:30:46 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:08.048 21:30:46 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:08.048 21:30:46 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:08.048 21:30:46 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:08.048 21:30:46 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:08.048 21:30:46 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:08.048 21:30:46 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:08.048 21:30:46 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:08.048 21:30:46 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:08.048 21:30:46 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:08.048 21:30:46 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:08.048 21:30:46 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:08.048 21:30:46 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:08.048 21:30:46 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:08.048 21:30:46 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:08.048 21:30:46 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:08.048 21:30:46 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:08.048 21:30:46 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:08.048 21:30:46 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:08.048 21:30:46 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:08.048 21:30:46 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:08.049 21:30:46 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:08.049 21:30:46 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:08.049 21:30:46 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:08.049 21:30:46 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:08.049 21:30:46 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:08.049 21:30:46 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:08.049 21:30:46 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:08.049 21:30:46 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:08.049 21:30:46 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:08:08.049 21:30:46 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:08.049 21:30:46 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:08.049 21:30:46 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:08.049 21:30:46 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:08.049 21:30:46 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:08.049 21:30:46 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:08.049 21:30:46 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:08.049 21:30:46 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:08.049 21:30:46 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:08.049 21:30:46 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:08.049 21:30:46 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:08.049 21:30:46 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:08.049 21:30:46 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:08.049 21:30:46 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:08.049 21:30:46 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:08.049 21:30:46 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:08.049 21:30:46 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:08.049 21:30:46 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:08.049 21:30:46 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:08.049 21:30:46 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:08:08.049 21:30:46 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:08.049 21:30:46 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:08.049 21:30:46 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:08.049 21:30:46 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:08.049 21:30:46 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:08.049 21:30:46 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:08.049 21:30:46 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:08.049 21:30:46 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:08.049 21:30:46 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:08.049 21:30:46 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:08.049 21:30:46 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:08.049 21:30:46 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:08.049 21:30:46 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:08.049 21:30:46 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:08.049 21:30:46 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:08.049 21:30:46 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:08.049 21:30:46 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:08.049 21:30:46 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:08.049 21:30:46 -- common/autotest_common.sh@48 -- # source 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:08.049 21:30:46 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:08.310 21:30:46 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:08.310 21:30:46 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:08.310 21:30:46 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:08.310 21:30:46 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:08.310 21:30:46 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:08.310 21:30:46 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:08.310 21:30:46 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:08.310 21:30:46 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:08.310 21:30:46 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:08.310 21:30:46 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:08.310 21:30:46 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:08.310 21:30:46 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:08.310 21:30:46 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:08.310 21:30:46 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:08.310 #define SPDK_CONFIG_H 00:08:08.310 #define SPDK_CONFIG_APPS 1 00:08:08.310 #define SPDK_CONFIG_ARCH native 00:08:08.310 #undef SPDK_CONFIG_ASAN 00:08:08.310 #undef SPDK_CONFIG_AVAHI 00:08:08.310 #undef SPDK_CONFIG_CET 00:08:08.310 #define SPDK_CONFIG_COVERAGE 1 00:08:08.310 #define SPDK_CONFIG_CROSS_PREFIX 00:08:08.310 #undef SPDK_CONFIG_CRYPTO 00:08:08.310 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:08.310 #undef SPDK_CONFIG_CUSTOMOCF 00:08:08.310 #undef SPDK_CONFIG_DAOS 00:08:08.310 #define SPDK_CONFIG_DAOS_DIR 00:08:08.310 #define SPDK_CONFIG_DEBUG 1 00:08:08.310 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:08.310 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:08.310 #define SPDK_CONFIG_DPDK_INC_DIR 00:08:08.310 #define SPDK_CONFIG_DPDK_LIB_DIR 00:08:08.310 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:08.310 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:08.310 #define SPDK_CONFIG_EXAMPLES 1 00:08:08.310 #undef SPDK_CONFIG_FC 00:08:08.310 #define SPDK_CONFIG_FC_PATH 00:08:08.310 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:08.310 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:08.310 #undef SPDK_CONFIG_FUSE 00:08:08.310 #define SPDK_CONFIG_FUZZER 1 00:08:08.310 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:08.310 #undef SPDK_CONFIG_GOLANG 00:08:08.310 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:08.310 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:08.310 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:08.310 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:08.310 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:08.310 #define SPDK_CONFIG_IDXD 1 00:08:08.310 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:08.310 #undef SPDK_CONFIG_IPSEC_MB 
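The header dump running through this stretch is the generated include/spdk/config.h: every CONFIG_*=y read out of build_config.sh earlier in the trace (CONFIG_FUZZER=y, CONFIG_UBSAN=y, CONFIG_VFIO_USER=y) resurfaces here as "#define SPDK_CONFIG_* 1", and every =n as an #undef. The applications.sh test traced right after the dump is nothing more than a substring match on this file, followed by the SPDK_AUTOTEST_DEBUG_APPS check at applications.sh@24. A minimal sketch of the same check; rootdir is assumed to point at the SPDK checkout, and the echo stands in for whatever the caller gates on the result:

    # Sketch of the [[ $(<config.h) == *"#define SPDK_CONFIG_DEBUG"* ]] test
    # traced below; rootdir is an assumed path to the SPDK tree.
    config_h="$rootdir/include/spdk/config.h"
    if [[ -e "$config_h" && $(<"$config_h") == *"#define SPDK_CONFIG_DEBUG"* ]]; then
        echo "debug build detected"
    fi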
00:08:08.310 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:08.310 #define SPDK_CONFIG_ISAL 1 00:08:08.310 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:08.310 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:08.310 #define SPDK_CONFIG_LIBDIR 00:08:08.310 #undef SPDK_CONFIG_LTO 00:08:08.310 #define SPDK_CONFIG_MAX_LCORES 00:08:08.310 #define SPDK_CONFIG_NVME_CUSE 1 00:08:08.310 #undef SPDK_CONFIG_OCF 00:08:08.310 #define SPDK_CONFIG_OCF_PATH 00:08:08.310 #define SPDK_CONFIG_OPENSSL_PATH 00:08:08.310 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:08.310 #undef SPDK_CONFIG_PGO_USE 00:08:08.310 #define SPDK_CONFIG_PREFIX /usr/local 00:08:08.310 #undef SPDK_CONFIG_RAID5F 00:08:08.310 #undef SPDK_CONFIG_RBD 00:08:08.310 #define SPDK_CONFIG_RDMA 1 00:08:08.310 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:08.310 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:08.310 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:08.310 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:08.310 #undef SPDK_CONFIG_SHARED 00:08:08.310 #undef SPDK_CONFIG_SMA 00:08:08.310 #define SPDK_CONFIG_TESTS 1 00:08:08.310 #undef SPDK_CONFIG_TSAN 00:08:08.310 #define SPDK_CONFIG_UBLK 1 00:08:08.310 #define SPDK_CONFIG_UBSAN 1 00:08:08.310 #undef SPDK_CONFIG_UNIT_TESTS 00:08:08.310 #undef SPDK_CONFIG_URING 00:08:08.310 #define SPDK_CONFIG_URING_PATH 00:08:08.310 #undef SPDK_CONFIG_URING_ZNS 00:08:08.310 #undef SPDK_CONFIG_USDT 00:08:08.310 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:08.310 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:08.310 #define SPDK_CONFIG_VFIO_USER 1 00:08:08.310 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:08.310 #define SPDK_CONFIG_VHOST 1 00:08:08.310 #define SPDK_CONFIG_VIRTIO 1 00:08:08.310 #undef SPDK_CONFIG_VTUNE 00:08:08.310 #define SPDK_CONFIG_VTUNE_DIR 00:08:08.310 #define SPDK_CONFIG_WERROR 1 00:08:08.310 #define SPDK_CONFIG_WPDK_DIR 00:08:08.310 #undef SPDK_CONFIG_XNVME 00:08:08.310 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:08.310 21:30:46 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:08.310 21:30:46 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:08.310 21:30:46 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:08.310 21:30:46 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:08.310 21:30:46 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:08.310 21:30:46 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:08.310 21:30:46 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:08.310 21:30:46 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:08.310 21:30:46 -- paths/export.sh@5 -- # export PATH 00:08:08.310 21:30:46 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:08.310 21:30:46 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:08.310 21:30:46 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:08.311 21:30:46 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:08.311 21:30:46 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:08.311 21:30:46 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:08.311 21:30:46 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:08.311 21:30:46 -- pm/common@16 -- # TEST_TAG=N/A 00:08:08.311 21:30:46 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:08.311 21:30:46 -- common/autotest_common.sh@52 -- # : 1 00:08:08.311 21:30:46 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:08.311 21:30:46 -- common/autotest_common.sh@56 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:08.311 21:30:46 -- common/autotest_common.sh@58 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:08.311 21:30:46 -- common/autotest_common.sh@60 -- # : 1 00:08:08.311 21:30:46 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:08.311 21:30:46 -- common/autotest_common.sh@62 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:08.311 21:30:46 -- common/autotest_common.sh@64 -- # : 00:08:08.311 21:30:46 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:08.311 21:30:46 -- common/autotest_common.sh@66 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:08.311 21:30:46 -- common/autotest_common.sh@68 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:08.311 21:30:46 -- common/autotest_common.sh@70 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:08.311 21:30:46 -- common/autotest_common.sh@72 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:08.311 21:30:46 -- common/autotest_common.sh@74 -- # : 0 00:08:08.311 
21:30:46 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:08.311 21:30:46 -- common/autotest_common.sh@76 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:08.311 21:30:46 -- common/autotest_common.sh@78 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:08.311 21:30:46 -- common/autotest_common.sh@80 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:08.311 21:30:46 -- common/autotest_common.sh@82 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:08.311 21:30:46 -- common/autotest_common.sh@84 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:08.311 21:30:46 -- common/autotest_common.sh@86 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:08.311 21:30:46 -- common/autotest_common.sh@88 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:08.311 21:30:46 -- common/autotest_common.sh@90 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:08.311 21:30:46 -- common/autotest_common.sh@92 -- # : 1 00:08:08.311 21:30:46 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:08.311 21:30:46 -- common/autotest_common.sh@94 -- # : 1 00:08:08.311 21:30:46 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:08.311 21:30:46 -- common/autotest_common.sh@96 -- # : rdma 00:08:08.311 21:30:46 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:08.311 21:30:46 -- common/autotest_common.sh@98 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:08.311 21:30:46 -- common/autotest_common.sh@100 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:08.311 21:30:46 -- common/autotest_common.sh@102 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:08.311 21:30:46 -- common/autotest_common.sh@104 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:08.311 21:30:46 -- common/autotest_common.sh@106 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:08.311 21:30:46 -- common/autotest_common.sh@108 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:08.311 21:30:46 -- common/autotest_common.sh@110 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:08.311 21:30:46 -- common/autotest_common.sh@112 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:08.311 21:30:46 -- common/autotest_common.sh@114 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:08.311 21:30:46 -- common/autotest_common.sh@116 -- # : 1 00:08:08.311 21:30:46 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:08.311 21:30:46 -- common/autotest_common.sh@118 -- # : 00:08:08.311 21:30:46 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:08.311 21:30:46 -- common/autotest_common.sh@120 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:08.311 21:30:46 -- common/autotest_common.sh@122 -- # : 0 
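Each "# : 0" / "# export SPDK_TEST_*" pair in this run of the trace is two lines of autotest_common.sh seen under set -x: a ":" no-op whose ${VAR:=default} expansion assigns a default only when the job configuration left the flag unset, then an export of the result. That is why RUN_NIGHTLY traces as ": 1" further up while the flags this job never set trace as ": 0". A minimal sketch of the idiom; the flag name is invented for illustration:

    # Assign 0 only if the variable is unset or empty. Under 'set -x' this
    # traces as ': 0', or as ': 1' when the job config already set the flag.
    : "${SPDK_TEST_EXAMPLE:=0}"
    export SPDK_TEST_EXAMPLE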
00:08:08.311 21:30:46 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:08.311 21:30:46 -- common/autotest_common.sh@124 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:08.311 21:30:46 -- common/autotest_common.sh@126 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:08.311 21:30:46 -- common/autotest_common.sh@128 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:08.311 21:30:46 -- common/autotest_common.sh@130 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:08.311 21:30:46 -- common/autotest_common.sh@132 -- # : 00:08:08.311 21:30:46 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:08.311 21:30:46 -- common/autotest_common.sh@134 -- # : true 00:08:08.311 21:30:46 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:08.311 21:30:46 -- common/autotest_common.sh@136 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:08.311 21:30:46 -- common/autotest_common.sh@138 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:08.311 21:30:46 -- common/autotest_common.sh@140 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:08.311 21:30:46 -- common/autotest_common.sh@142 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:08.311 21:30:46 -- common/autotest_common.sh@144 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:08.311 21:30:46 -- common/autotest_common.sh@146 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:08.311 21:30:46 -- common/autotest_common.sh@148 -- # : 00:08:08.311 21:30:46 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:08.311 21:30:46 -- common/autotest_common.sh@150 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:08.311 21:30:46 -- common/autotest_common.sh@152 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:08.311 21:30:46 -- common/autotest_common.sh@154 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:08.311 21:30:46 -- common/autotest_common.sh@156 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:08.311 21:30:46 -- common/autotest_common.sh@158 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:08.311 21:30:46 -- common/autotest_common.sh@160 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:08.311 21:30:46 -- common/autotest_common.sh@163 -- # : 00:08:08.311 21:30:46 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:08.311 21:30:46 -- common/autotest_common.sh@165 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:08.311 21:30:46 -- common/autotest_common.sh@167 -- # : 0 00:08:08.311 21:30:46 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:08.311 21:30:46 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:08.311 21:30:46 -- 
common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:08.311 21:30:46 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:08.311 21:30:46 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:08.311 21:30:46 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:08.311 21:30:46 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:08.311 21:30:46 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:08.311 21:30:46 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:08.311 21:30:46 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:08.311 21:30:46 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:08.311 21:30:46 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:08.312 21:30:46 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:08.312 21:30:46 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:08.312 21:30:46 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:08.312 21:30:46 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:08.312 21:30:46 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:08.312 21:30:46 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:08.312 21:30:46 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:08.312 21:30:46 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:08.312 21:30:46 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:08.312 21:30:46 -- common/autotest_common.sh@196 -- # cat 00:08:08.312 21:30:46 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:08.312 21:30:46 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:08.312 21:30:46 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:08.312 21:30:46 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:08.312 21:30:46 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:08.312 21:30:46 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:08.312 21:30:46 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:08.312 21:30:46 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:08.312 21:30:46 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:08.312 21:30:46 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:08.312 21:30:46 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:08.312 21:30:46 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:08.312 21:30:46 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:08.312 21:30:46 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:08.312 21:30:46 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:08.312 21:30:46 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:08.312 21:30:46 -- common/autotest_common.sh@242 -- # 
AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:08.312 21:30:46 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:08.312 21:30:46 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:08.312 21:30:46 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:08:08.312 21:30:46 -- common/autotest_common.sh@249 -- # export valgrind= 00:08:08.312 21:30:46 -- common/autotest_common.sh@249 -- # valgrind= 00:08:08.312 21:30:46 -- common/autotest_common.sh@255 -- # uname -s 00:08:08.312 21:30:46 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:08:08.312 21:30:46 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:08:08.312 21:30:46 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:08:08.312 21:30:46 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:08:08.312 21:30:46 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:08.312 21:30:46 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:08.312 21:30:46 -- common/autotest_common.sh@265 -- # MAKE=make 00:08:08.312 21:30:46 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:08:08.312 21:30:46 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:08:08.312 21:30:46 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:08:08.312 21:30:46 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:08.312 21:30:46 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:08:08.312 21:30:46 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:08:08.312 21:30:46 -- common/autotest_common.sh@309 -- # [[ -z 3589031 ]] 00:08:08.312 21:30:46 -- common/autotest_common.sh@309 -- # kill -0 3589031 00:08:08.312 21:30:46 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:08:08.312 21:30:46 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:08:08.312 21:30:46 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:08:08.312 21:30:46 -- common/autotest_common.sh@322 -- # local mount target_dir 00:08:08.312 21:30:46 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:08:08.312 21:30:46 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:08:08.312 21:30:46 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:08:08.312 21:30:46 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:08:08.312 21:30:46 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.MB93YA 00:08:08.312 21:30:46 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:08.312 21:30:46 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:08:08.312 21:30:46 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:08:08.312 21:30:46 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.MB93YA/tests/vfio /tmp/spdk.MB93YA 00:08:08.312 21:30:46 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:08:08.312 21:30:46 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:08.312 21:30:46 -- common/autotest_common.sh@318 -- # df -T 00:08:08.312 21:30:46 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 
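The associative-array assignments here are set_test_storage (entered at autotest_common.sh@1665 above) indexing "df -T" output by mount point: each row is read as "source fs size use avail _ mount" and scaled from df's 1K blocks to bytes, so that the roughly 2 GiB requested_size (bumped to 2214592512 at @349) can be compared against avails. A condensed sketch of that loop; the candidate-selection tail is simplified relative to the real function, storage_candidates is assumed to be the array built at @334 above, and the candidate directories are assumed to exist already (the mkdir -p at @346 creates them):

    # Index df -T rows by mount point; df reports 1K blocks, so scale to bytes.
    set_test_storage_sketch() {
        local requested_size=$1 target_dir mount
        local source fs size use avail _
        local -A mounts fss sizes avails uses
        while read -r source fs size use avail _ mount; do
            mounts["$mount"]=$source
            fss["$mount"]=$fs
            sizes["$mount"]=$((size * 1024))
            uses["$mount"]=$((use * 1024))
            avails["$mount"]=$((avail * 1024))
        done < <(df -T | grep -v Filesystem)
        # Return the first candidate directory whose filesystem has room.
        for target_dir in "${storage_candidates[@]}"; do
            mount=$(df --output=target "$target_dir" | tail -n1)
            if [[ ${avails[$mount]:-0} -ge $requested_size ]]; then
                echo "$target_dir"
                return 0
            fi
        done
        return 1
    }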
00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:08:08.312 21:30:46 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:08:08.312 21:30:46 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:08:08.312 21:30:46 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:08:08.312 21:30:46 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # avails["$mount"]=54764761088 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:08:08.312 21:30:46 -- common/autotest_common.sh@354 -- # uses["$mount"]=6977556480 00:08:08.312 21:30:46 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:08.312 21:30:46 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:08:08.312 21:30:46 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342484992 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:08:08.312 21:30:46 -- common/autotest_common.sh@354 -- # uses["$mount"]=5980160 00:08:08.312 21:30:46 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870765568 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:08.312 21:30:46 -- common/autotest_common.sh@354 -- # uses["$mount"]=393216 00:08:08.312 21:30:46 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:08.312 21:30:46 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:08:08.312 21:30:46 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:08:08.312 21:30:46 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:08:08.312 21:30:46 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:08.312 21:30:46 -- common/autotest_common.sh@357 
-- # printf '* Looking for test storage...\n' 00:08:08.312 * Looking for test storage... 00:08:08.312 21:30:46 -- common/autotest_common.sh@359 -- # local target_space new_size 00:08:08.312 21:30:46 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:08:08.312 21:30:46 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:08.312 21:30:46 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:08.312 21:30:46 -- common/autotest_common.sh@363 -- # mount=/ 00:08:08.312 21:30:46 -- common/autotest_common.sh@365 -- # target_space=54764761088 00:08:08.312 21:30:46 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:08:08.312 21:30:46 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:08:08.312 21:30:46 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:08:08.312 21:30:46 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:08:08.312 21:30:46 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:08:08.312 21:30:46 -- common/autotest_common.sh@372 -- # new_size=9192148992 00:08:08.312 21:30:46 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:08.312 21:30:46 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:08.312 21:30:46 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:08.312 21:30:46 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:08.312 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:08.312 21:30:46 -- common/autotest_common.sh@380 -- # return 0 00:08:08.312 21:30:46 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:08:08.312 21:30:46 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:08:08.312 21:30:46 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:08.312 21:30:46 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:08.312 21:30:46 -- common/autotest_common.sh@1672 -- # true 00:08:08.312 21:30:46 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:08:08.312 21:30:46 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:08.312 21:30:46 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:08.312 21:30:46 -- common/autotest_common.sh@27 -- # exec 00:08:08.312 21:30:46 -- common/autotest_common.sh@29 -- # exec 00:08:08.312 21:30:46 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:08.312 21:30:46 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:08:08.313 21:30:46 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:08.313 21:30:46 -- common/autotest_common.sh@18 -- # set -x 00:08:08.313 21:30:46 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:08.313 21:30:46 -- ../common.sh@8 -- # pids=() 00:08:08.313 21:30:46 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:08.313 21:30:46 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:08.313 21:30:46 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:08.313 21:30:46 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:08.313 21:30:46 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:08.313 21:30:46 -- vfio/run.sh@65 -- # mem_size=0 00:08:08.313 21:30:46 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:08.313 21:30:46 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:08.313 21:30:46 -- ../common.sh@69 -- # local fuzz_num=7 00:08:08.313 21:30:46 -- ../common.sh@70 -- # local time=1 00:08:08.313 21:30:46 -- ../common.sh@72 -- # (( i = 0 )) 00:08:08.313 21:30:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.313 21:30:46 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:08.313 21:30:46 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:08.313 21:30:46 -- vfio/run.sh@23 -- # local timen=1 00:08:08.313 21:30:46 -- vfio/run.sh@24 -- # local core=0x1 00:08:08.313 21:30:46 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:08.313 21:30:46 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:08.313 21:30:46 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:08.313 21:30:46 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:08.313 21:30:46 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:08.313 21:30:46 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:08.313 21:30:46 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:08.313 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:08.313 21:30:46 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:08.313 [2024-07-12 21:30:47.000096] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
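About the storage probe traced further above: set_test_storage reads `df -T` into parallel arrays keyed by mount point, then walks storage_candidates until one has enough headroom for the roughly 2 GiB request (requested_size plus margin) and exports it as SPDK_TEST_STORAGE. A condensed, hedged sketch of that logic follows — it is not the exact autotest_common.sh source, and --block-size=1 is used here only so the comparison stays in bytes:

    # sketch: fill availability tables from df, then pick the first
    # candidate directory whose mount has enough free space
    declare -A fss avails
    while read -r source fs size used avail _ mount; do
        fss["$mount"]=$fs
        avails["$mount"]=$avail
    done < <(df -T --block-size=1 | grep -v Filesystem)

    requested_size=2147483648
    for target_dir in "${storage_candidates[@]}"; do   # as set in the trace above
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        if (( avails[$mount] >= requested_size )); then
            export SPDK_TEST_STORAGE=$target_dir
            break
        fi
    done

In this run the overlay root (/) reports about 54 GB available, so the first candidate under the spdk tree is accepted immediately, which is the "Found test storage at ..." line above.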
00:08:08.313 [2024-07-12 21:30:47.000167] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3589111 ] 00:08:08.313 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.313 [2024-07-12 21:30:47.070942] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.572 [2024-07-12 21:30:47.142009] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:08.572 [2024-07-12 21:30:47.142165] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.572 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.572 INFO: Seed: 1225754215 00:08:08.572 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:08.572 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:08.572 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:08.572 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.572 #2 INITED exec/s: 0 rss: 61Mb 00:08:08.572 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:08.572 This may also happen if the target rejected all inputs we tried so far 00:08:09.090 NEW_FUNC[1/631]: 0x4806f0 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:09.090 NEW_FUNC[2/631]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:09.090 #7 NEW cov: 10698 ft: 10524 corp: 2/11b lim: 60 exec/s: 0 rss: 66Mb L: 10/10 MS: 5 InsertByte-ChangeBinInt-ChangeBinInt-ChangeByte-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:09.349 NEW_FUNC[1/1]: 0x1389560 in nvmf_vfio_user_sq_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:5634 00:08:09.349 #8 NEW cov: 10749 ft: 14484 corp: 3/20b lim: 60 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:09.608 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:09.608 #9 NEW cov: 10769 ft: 15523 corp: 4/30b lim: 60 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:08:09.608 #10 NEW cov: 10769 ft: 16092 corp: 5/40b lim: 60 exec/s: 10 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:08:09.867 #11 NEW cov: 10769 ft: 16616 corp: 6/49b lim: 60 exec/s: 11 rss: 69Mb L: 9/10 MS: 1 CMP- DE: "p\000\000\000\000\000\000\000"- 00:08:10.126 #12 NEW cov: 10773 ft: 17040 corp: 7/59b lim: 60 exec/s: 12 rss: 69Mb L: 10/10 MS: 1 ShuffleBytes- 00:08:10.385 #13 NEW cov: 10773 ft: 17363 corp: 8/70b lim: 60 exec/s: 13 rss: 69Mb L: 11/11 MS: 1 CrossOver- 00:08:10.385 #14 NEW cov: 10773 ft: 17614 corp: 9/89b lim: 60 exec/s: 14 rss: 69Mb L: 19/19 MS: 1 PersAutoDict- DE: "p\000\000\000\000\000\000\000"- 00:08:10.643 #15 NEW cov: 10780 ft: 17780 corp: 10/100b lim: 60 exec/s: 15 rss: 69Mb L: 11/19 MS: 1 ShuffleBytes- 00:08:10.902 #16 pulse cov: 10780 ft: 18052 corp: 10/100b lim: 60 exec/s: 8 rss: 69Mb 00:08:10.902 #16 NEW cov: 10780 ft: 18052 corp: 11/109b lim: 60 exec/s: 8 rss: 69Mb L: 9/19 MS: 1 ChangeByte- 00:08:10.902 #16 DONE cov: 10780 ft: 18052 corp: 11/109b lim: 60 exec/s: 8 rss: 69Mb 00:08:10.902 ###### Recommended dictionary. 
###### 00:08:10.902 "\000\000\000\000\000\000\000\000" # Uses: 1 00:08:10.902 "p\000\000\000\000\000\000\000" # Uses: 1 00:08:10.902 ###### End of recommended dictionary. ###### 00:08:10.902 Done 16 runs in 2 second(s) 00:08:11.162 21:30:49 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:11.162 21:30:49 -- ../common.sh@72 -- # (( i++ )) 00:08:11.162 21:30:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:11.162 21:30:49 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:11.162 21:30:49 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:11.162 21:30:49 -- vfio/run.sh@23 -- # local timen=1 00:08:11.162 21:30:49 -- vfio/run.sh@24 -- # local core=0x1 00:08:11.162 21:30:49 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:11.162 21:30:49 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:11.162 21:30:49 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:11.162 21:30:49 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:11.162 21:30:49 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:11.162 21:30:49 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:11.162 21:30:49 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:11.162 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:11.162 21:30:49 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:11.162 [2024-07-12 21:30:49.799549] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:11.162 [2024-07-12 21:30:49.799619] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3589570 ] 00:08:11.162 EAL: No free 2048 kB hugepages reported on node 1 00:08:11.162 [2024-07-12 21:30:49.870095] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.162 [2024-07-12 21:30:49.938420] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:11.162 [2024-07-12 21:30:49.938582] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.421 INFO: Running with entropic power schedule (0xFF, 100). 00:08:11.421 INFO: Seed: 4021751686 00:08:11.421 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:11.421 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:11.421 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:11.421 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.421 #2 INITED exec/s: 0 rss: 62Mb 00:08:11.421 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
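The same workspace dance now repeats for fuzzer index 1, and will again for indices 2 through 6: vfio/run.sh counts the `.fn =` registrations in llvm_vfio_fuzz.c to size the loop, builds a /tmp/vfio-user-N scratch tree per index, and rewrites the template JSON config with sed so each instance gets private vfio-user sockets. A hedged sketch of that per-index setup, with SPDK_DIR standing in for the long Jenkins workspace path:

    SPDK_DIR=/path/to/spdk   # stand-in for the workspace path in the trace
    fuzzfile=$SPDK_DIR/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c
    fuzz_num=$(grep -c '\.fn =' "$fuzzfile")   # 7 fuzz targets in this build

    for (( i = 0; i < fuzz_num; i++ )); do
        mkdir -p /tmp/vfio-user-$i/domain/1 /tmp/vfio-user-$i/domain/2 \
                 "$SPDK_DIR/../corpus/llvm_vfio_$i"
        sed -e "s%/tmp/vfio-user/domain/1%/tmp/vfio-user-$i/domain/1%;
                s%/tmp/vfio-user/domain/2%/tmp/vfio-user-$i/domain/2%" \
            "$SPDK_DIR/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" \
            > /tmp/vfio-user-$i/fuzz_vfio_json.conf
    done

(In the real script the loop body lives in start_llvm_fuzz, and each iteration launches the fuzzer before moving on, as the trace shows.)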
00:08:11.421 This may also happen if the target rejected all inputs we tried so far 00:08:11.680 [2024-07-12 21:30:50.245524] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:11.680 [2024-07-12 21:30:50.245556] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:11.680 [2024-07-12 21:30:50.245574] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:11.939 NEW_FUNC[1/638]: 0x480c90 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:11.939 NEW_FUNC[2/638]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:11.939 #3 NEW cov: 10727 ft: 10525 corp: 2/10b lim: 40 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 CMP- DE: "\001\000\000\000\000\000\002i"- 00:08:11.939 [2024-07-12 21:30:50.704260] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:11.939 [2024-07-12 21:30:50.704294] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:11.939 [2024-07-12 21:30:50.704311] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:12.198 #4 NEW cov: 10741 ft: 13327 corp: 3/27b lim: 40 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 InsertRepeatedBytes- 00:08:12.198 [2024-07-12 21:30:50.885547] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:12.198 [2024-07-12 21:30:50.885575] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:12.198 [2024-07-12 21:30:50.885593] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:12.457 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:12.457 #5 NEW cov: 10758 ft: 14376 corp: 4/36b lim: 40 exec/s: 0 rss: 69Mb L: 9/17 MS: 1 ShuffleBytes- 00:08:12.457 [2024-07-12 21:30:51.066171] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:12.457 [2024-07-12 21:30:51.066193] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:12.457 [2024-07-12 21:30:51.066210] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:12.457 #6 NEW cov: 10758 ft: 14973 corp: 5/53b lim: 40 exec/s: 6 rss: 69Mb L: 17/17 MS: 1 ChangeBit- 00:08:12.716 [2024-07-12 21:30:51.245821] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:12.716 [2024-07-12 21:30:51.245844] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:12.716 [2024-07-12 21:30:51.245861] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:12.716 #7 NEW cov: 10758 ft: 15917 corp: 6/70b lim: 40 exec/s: 7 rss: 69Mb L: 17/17 MS: 1 ChangeByte- 00:08:12.716 [2024-07-12 21:30:51.421141] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:12.716 [2024-07-12 21:30:51.421162] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:12.716 [2024-07-12 21:30:51.421180] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:12.974 #8 NEW cov: 10758 ft: 16112 corp: 7/87b lim: 40 exec/s: 8 rss: 69Mb L: 17/17 MS: 1 ChangeBit- 00:08:12.974 [2024-07-12 21:30:51.594591] 
vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:12.974 [2024-07-12 21:30:51.594612] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:12.974 [2024-07-12 21:30:51.594630] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:12.974 #9 NEW cov: 10758 ft: 16329 corp: 8/104b lim: 40 exec/s: 9 rss: 69Mb L: 17/17 MS: 1 ChangeBinInt- 00:08:13.232 [2024-07-12 21:30:51.768159] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:13.232 [2024-07-12 21:30:51.768180] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:13.232 [2024-07-12 21:30:51.768196] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:13.232 #10 NEW cov: 10758 ft: 16644 corp: 9/113b lim: 40 exec/s: 10 rss: 69Mb L: 9/17 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\002i"- 00:08:13.232 [2024-07-12 21:30:51.941669] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:13.232 [2024-07-12 21:30:51.941691] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:13.232 [2024-07-12 21:30:51.941708] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:13.491 #11 NEW cov: 10765 ft: 17183 corp: 10/122b lim: 40 exec/s: 11 rss: 69Mb L: 9/17 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\002i"- 00:08:13.491 [2024-07-12 21:30:52.112916] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:13.491 [2024-07-12 21:30:52.112937] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:13.491 [2024-07-12 21:30:52.112954] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:13.491 #12 NEW cov: 10765 ft: 17483 corp: 11/139b lim: 40 exec/s: 6 rss: 69Mb L: 17/17 MS: 1 ShuffleBytes- 00:08:13.491 #12 DONE cov: 10765 ft: 17483 corp: 11/139b lim: 40 exec/s: 6 rss: 69Mb 00:08:13.491 ###### Recommended dictionary. ###### 00:08:13.491 "\001\000\000\000\000\000\002i" # Uses: 2 00:08:13.491 ###### End of recommended dictionary. 
###### 00:08:13.491 Done 12 runs in 2 second(s) 00:08:13.750 21:30:52 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:13.750 21:30:52 -- ../common.sh@72 -- # (( i++ )) 00:08:13.750 21:30:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.750 21:30:52 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:13.750 21:30:52 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:13.750 21:30:52 -- vfio/run.sh@23 -- # local timen=1 00:08:13.750 21:30:52 -- vfio/run.sh@24 -- # local core=0x1 00:08:13.750 21:30:52 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:13.750 21:30:52 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:13.750 21:30:52 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:13.750 21:30:52 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:13.750 21:30:52 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:13.750 21:30:52 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:13.750 21:30:52 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:13.750 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:13.750 21:30:52 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:13.750 [2024-07-12 21:30:52.522953] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:13.750 [2024-07-12 21:30:52.523031] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3589955 ] 00:08:14.009 EAL: No free 2048 kB hugepages reported on node 1 00:08:14.009 [2024-07-12 21:30:52.596038] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.009 [2024-07-12 21:30:52.665259] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:14.009 [2024-07-12 21:30:52.665417] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.273 INFO: Running with entropic power schedule (0xFF, 100). 00:08:14.273 INFO: Seed: 2453791972 00:08:14.273 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:14.273 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:14.274 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:14.274 INFO: A corpus is not provided, starting from an empty corpus 00:08:14.274 #2 INITED exec/s: 0 rss: 62Mb 00:08:14.274 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
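The launch line above is identical for every run except for the index-derived paths and the -Z selector. Here it is reformatted with one flag per element for readability; the comments are inferred from the local-variable names in vfio/run.sh's trace (core, timen, corpus_dir, and so on), so treat them as a reading aid rather than authoritative option documentation:

    fuzz_args=(
        -m 0x1                                      # reactor core mask ($core)
        -s 0                                        # DPDK memory size in MB; 0 with --huge-unlink here
        -P "$SPDK_DIR/../output/llvm/"              # output prefix for artifacts
        -F "/tmp/vfio-user-$i/domain/1"             # vfio-user endpoint under fuzz ($vfiouser_dir)
        -c "/tmp/vfio-user-$i/fuzz_vfio_json.conf"  # per-run JSON config rewritten by sed
        -t 1                                        # time budget in seconds ($timen)
        -D "$SPDK_DIR/../corpus/llvm_vfio_$i"       # persistent corpus dir ($corpus_dir)
        -Y "/tmp/vfio-user-$i/domain/2"             # vfio-user I/O dir ($vfiouser_io_dir)
        -r "/tmp/vfio-user-$i/spdk$i.sock"          # RPC socket for this instance
        -Z "$i"                                     # fuzz-target index ($fuzzer_type)
    )
    "$SPDK_DIR/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" "${fuzz_args[@]}"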
00:08:14.274 This may also happen if the target rejected all inputs we tried so far 00:08:14.274 [2024-07-12 21:30:52.965479] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:14.274 [2024-07-12 21:30:52.965526] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:14.790 NEW_FUNC[1/637]: 0x481670 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:14.790 NEW_FUNC[2/637]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:14.790 #16 NEW cov: 10711 ft: 10666 corp: 2/31b lim: 80 exec/s: 0 rss: 66Mb L: 30/30 MS: 4 CrossOver-ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:14.790 [2024-07-12 21:30:53.421209] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:14.790 [2024-07-12 21:30:53.421250] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:14.790 #17 NEW cov: 10728 ft: 14187 corp: 3/62b lim: 80 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 InsertByte- 00:08:15.048 [2024-07-12 21:30:53.595157] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:15.048 [2024-07-12 21:30:53.595186] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:15.048 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:15.048 #18 NEW cov: 10745 ft: 14854 corp: 4/92b lim: 80 exec/s: 0 rss: 69Mb L: 30/31 MS: 1 ChangeByte- 00:08:15.048 [2024-07-12 21:30:53.769135] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:15.048 [2024-07-12 21:30:53.769164] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:15.307 #19 NEW cov: 10745 ft: 16031 corp: 5/122b lim: 80 exec/s: 19 rss: 69Mb L: 30/31 MS: 1 ChangeBit- 00:08:15.307 [2024-07-12 21:30:53.945081] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:15.307 [2024-07-12 21:30:53.945109] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:15.307 #20 NEW cov: 10745 ft: 16284 corp: 6/156b lim: 80 exec/s: 20 rss: 69Mb L: 34/34 MS: 1 CrossOver- 00:08:15.565 [2024-07-12 21:30:54.118934] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:15.565 [2024-07-12 21:30:54.118963] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:15.565 #21 NEW cov: 10745 ft: 16395 corp: 7/216b lim: 80 exec/s: 21 rss: 69Mb L: 60/60 MS: 1 CrossOver- 00:08:15.565 [2024-07-12 21:30:54.290791] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:15.565 [2024-07-12 21:30:54.290820] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:15.824 #22 NEW cov: 10745 ft: 16718 corp: 8/278b lim: 80 exec/s: 22 rss: 69Mb L: 62/62 MS: 1 InsertRepeatedBytes- 00:08:15.824 [2024-07-12 21:30:54.463739] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:15.824 [2024-07-12 21:30:54.463767] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:15.824 #23 NEW cov: 10745 ft: 16788 corp: 9/309b lim: 80 exec/s: 23 rss: 69Mb L: 31/62 MS: 1 CMP- DE: "\021\000"- 00:08:16.083 [2024-07-12 21:30:54.637830] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: 
msg0: no payload for cmd5 00:08:16.083 [2024-07-12 21:30:54.637858] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:16.083 #24 NEW cov: 10752 ft: 17129 corp: 10/339b lim: 80 exec/s: 24 rss: 69Mb L: 30/62 MS: 1 ShuffleBytes- 00:08:16.083 [2024-07-12 21:30:54.810582] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:16.083 [2024-07-12 21:30:54.810614] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:16.341 #25 NEW cov: 10752 ft: 17340 corp: 11/382b lim: 80 exec/s: 12 rss: 69Mb L: 43/62 MS: 1 CopyPart- 00:08:16.341 #25 DONE cov: 10752 ft: 17340 corp: 11/382b lim: 80 exec/s: 12 rss: 69Mb 00:08:16.341 ###### Recommended dictionary. ###### 00:08:16.341 "\021\000" # Uses: 0 00:08:16.341 ###### End of recommended dictionary. ###### 00:08:16.341 Done 25 runs in 2 second(s) 00:08:16.600 21:30:55 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:16.600 21:30:55 -- ../common.sh@72 -- # (( i++ )) 00:08:16.600 21:30:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:16.600 21:30:55 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:16.600 21:30:55 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:16.600 21:30:55 -- vfio/run.sh@23 -- # local timen=1 00:08:16.600 21:30:55 -- vfio/run.sh@24 -- # local core=0x1 00:08:16.600 21:30:55 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:16.600 21:30:55 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:16.600 21:30:55 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:16.600 21:30:55 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:16.600 21:30:55 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:16.600 21:30:55 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:16.600 21:30:55 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:16.600 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:16.600 21:30:55 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:16.600 [2024-07-12 21:30:55.215233] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
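A note on reading the interleaved fuzzer output: the `#N NEW cov: ... ft: ... corp: A/Bb` lines are libFuzzer's standard progress format — cov counts covered code edges, ft counts features (edges plus hit-count buckets), corp gives corpus entries and total size, exec/s and rss track throughput and memory, and the MS: tail names the mutation chain that produced the input. The *ERROR* lines around them are the SPDK vfio-user target rejecting malformed commands, which is expected while fuzzing. If you have a saved copy of such a run, a throwaway one-liner like this pulls out the coverage trajectory (run.log is a stand-in name):

    grep -Eo '#[0-9]+ (NEW|pulse|DONE) cov: [0-9]+ ft: [0-9]+' run.log |
        awk '{sub(/^#/, "", $1); print $1, $4, $6}'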
00:08:16.600 [2024-07-12 21:30:55.215312] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3590495 ] 00:08:16.600 EAL: No free 2048 kB hugepages reported on node 1 00:08:16.600 [2024-07-12 21:30:55.288168] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.600 [2024-07-12 21:30:55.355944] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:16.600 [2024-07-12 21:30:55.356105] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.859 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.859 INFO: Seed: 844826201 00:08:16.859 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:16.859 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:16.859 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:16.859 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.859 #2 INITED exec/s: 0 rss: 61Mb 00:08:16.859 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:16.859 This may also happen if the target rejected all inputs we tried so far 00:08:17.376 NEW_FUNC[1/632]: 0x481d50 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:17.376 NEW_FUNC[2/632]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:17.376 #20 NEW cov: 10699 ft: 10671 corp: 2/54b lim: 320 exec/s: 0 rss: 66Mb L: 53/53 MS: 3 CopyPart-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:17.634 #21 NEW cov: 10717 ft: 14240 corp: 3/107b lim: 320 exec/s: 0 rss: 68Mb L: 53/53 MS: 1 ChangeBit- 00:08:17.892 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:17.892 #22 NEW cov: 10734 ft: 15121 corp: 4/160b lim: 320 exec/s: 0 rss: 69Mb L: 53/53 MS: 1 CrossOver- 00:08:17.892 #28 NEW cov: 10734 ft: 15650 corp: 5/213b lim: 320 exec/s: 28 rss: 69Mb L: 53/53 MS: 1 ShuffleBytes- 00:08:18.150 #29 NEW cov: 10734 ft: 16016 corp: 6/247b lim: 320 exec/s: 29 rss: 69Mb L: 34/53 MS: 1 EraseBytes- 00:08:18.408 #30 NEW cov: 10734 ft: 16414 corp: 7/300b lim: 320 exec/s: 30 rss: 69Mb L: 53/53 MS: 1 ChangeBinInt- 00:08:18.408 #31 NEW cov: 10734 ft: 16814 corp: 8/406b lim: 320 exec/s: 31 rss: 69Mb L: 106/106 MS: 1 CrossOver- 00:08:18.666 #32 NEW cov: 10734 ft: 16898 corp: 9/441b lim: 320 exec/s: 32 rss: 69Mb L: 35/106 MS: 1 InsertByte- 00:08:18.925 #33 NEW cov: 10741 ft: 17047 corp: 10/494b lim: 320 exec/s: 33 rss: 69Mb L: 53/106 MS: 1 ChangeBit- 00:08:19.184 #34 NEW cov: 10741 ft: 17139 corp: 11/547b lim: 320 exec/s: 17 rss: 69Mb L: 53/106 MS: 1 ShuffleBytes- 00:08:19.184 #34 DONE cov: 10741 ft: 17139 corp: 11/547b lim: 320 exec/s: 17 rss: 69Mb 00:08:19.184 Done 34 runs in 2 second(s) 00:08:19.443 21:30:57 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:19.443 21:30:57 -- ../common.sh@72 -- # (( i++ )) 00:08:19.443 21:30:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:19.443 21:30:57 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:19.443 21:30:57 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:19.443 21:30:57 -- vfio/run.sh@23 -- # local timen=1 00:08:19.443 21:30:57 -- 
vfio/run.sh@24 -- # local core=0x1 00:08:19.443 21:30:57 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:19.443 21:30:57 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:19.443 21:30:57 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:19.443 21:30:57 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:19.443 21:30:57 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:19.443 21:30:57 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:19.443 21:30:57 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:19.443 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:19.443 21:30:57 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:19.443 [2024-07-12 21:30:58.015376] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:19.443 [2024-07-12 21:30:58.015473] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3591033 ] 00:08:19.443 EAL: No free 2048 kB hugepages reported on node 1 00:08:19.443 [2024-07-12 21:30:58.089274] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.443 [2024-07-12 21:30:58.159641] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:19.443 [2024-07-12 21:30:58.159795] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.701 INFO: Running with entropic power schedule (0xFF, 100). 00:08:19.701 INFO: Seed: 3650841528 00:08:19.702 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:19.702 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:19.702 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:19.702 INFO: A corpus is not provided, starting from an empty corpus 00:08:19.702 #2 INITED exec/s: 0 rss: 61Mb 00:08:19.702 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
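Two details of the per-run environment worth calling out. Each instance initializes DPDK EAL with a unique --file-prefix (the spdk_pidNNNN strings in the EAL parameter lines), which is what lets these back-to-back fuzzers reuse the same core mask and scratch areas without colliding on shared-memory files. And the recurring "EAL: No free 2048 kB hugepages reported on node 1" notice looks benign in this log — every run still reaches its "Done N runs" line — which is consistent with the -s 0 / --huge-unlink launch mode. If you do need to inspect hugepage inventory per NUMA node on a Linux box, something like the following (purely illustrative, standard sysfs paths) will show it:

    grep -H . /sys/devices/system/node/node*/hugepages/hugepages-2048kB/{nr,free}_hugepages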
00:08:19.702 This may also happen if the target rejected all inputs we tried so far 00:08:20.218 NEW_FUNC[1/632]: 0x4825d0 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:20.218 NEW_FUNC[2/632]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:20.218 #27 NEW cov: 10700 ft: 10596 corp: 2/142b lim: 320 exec/s: 0 rss: 67Mb L: 141/141 MS: 5 ShuffleBytes-ChangeBit-InsertRepeatedBytes-ChangeBit-InsertRepeatedBytes- 00:08:20.476 #28 NEW cov: 10719 ft: 13637 corp: 3/283b lim: 320 exec/s: 0 rss: 68Mb L: 141/141 MS: 1 ChangeByte- 00:08:20.476 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:20.476 #29 NEW cov: 10736 ft: 15151 corp: 4/424b lim: 320 exec/s: 0 rss: 69Mb L: 141/141 MS: 1 CopyPart- 00:08:20.735 #30 NEW cov: 10736 ft: 15992 corp: 5/565b lim: 320 exec/s: 30 rss: 69Mb L: 141/141 MS: 1 ChangeBit- 00:08:20.994 #31 NEW cov: 10736 ft: 16126 corp: 6/780b lim: 320 exec/s: 31 rss: 69Mb L: 215/215 MS: 1 CrossOver- 00:08:20.994 #32 NEW cov: 10736 ft: 16433 corp: 7/921b lim: 320 exec/s: 32 rss: 69Mb L: 141/215 MS: 1 CMP- DE: "\001\000\000\000\000\000\000u"- 00:08:21.253 #34 NEW cov: 10736 ft: 16812 corp: 8/968b lim: 320 exec/s: 34 rss: 69Mb L: 47/215 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:21.513 #35 NEW cov: 10736 ft: 17233 corp: 9/1183b lim: 320 exec/s: 35 rss: 69Mb L: 215/215 MS: 1 ChangeBinInt- 00:08:21.513 #36 NEW cov: 10743 ft: 17303 corp: 10/1332b lim: 320 exec/s: 36 rss: 69Mb L: 149/215 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000u"- 00:08:21.772 #37 NEW cov: 10743 ft: 17479 corp: 11/1474b lim: 320 exec/s: 18 rss: 69Mb L: 142/215 MS: 1 InsertByte- 00:08:21.772 #37 DONE cov: 10743 ft: 17479 corp: 11/1474b lim: 320 exec/s: 18 rss: 69Mb 00:08:21.772 ###### Recommended dictionary. ###### 00:08:21.772 "\001\000\000\000\000\000\000u" # Uses: 1 00:08:21.772 ###### End of recommended dictionary. 
###### 00:08:21.772 Done 37 runs in 2 second(s) 00:08:22.032 21:31:00 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:22.032 21:31:00 -- ../common.sh@72 -- # (( i++ )) 00:08:22.032 21:31:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:22.032 21:31:00 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:22.032 21:31:00 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:22.032 21:31:00 -- vfio/run.sh@23 -- # local timen=1 00:08:22.032 21:31:00 -- vfio/run.sh@24 -- # local core=0x1 00:08:22.032 21:31:00 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:22.032 21:31:00 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:22.032 21:31:00 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:22.032 21:31:00 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:22.032 21:31:00 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:22.032 21:31:00 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:22.032 21:31:00 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:22.032 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:22.032 21:31:00 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:22.032 [2024-07-12 21:31:00.729123] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:22.032 [2024-07-12 21:31:00.729191] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3591586 ] 00:08:22.032 EAL: No free 2048 kB hugepages reported on node 1 00:08:22.032 [2024-07-12 21:31:00.801005] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.291 [2024-07-12 21:31:00.870312] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:22.291 [2024-07-12 21:31:00.870490] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.291 INFO: Running with entropic power schedule (0xFF, 100). 00:08:22.291 INFO: Seed: 2067880393 00:08:22.291 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:22.291 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:22.291 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:22.291 INFO: A corpus is not provided, starting from an empty corpus 00:08:22.291 #2 INITED exec/s: 0 rss: 61Mb 00:08:22.291 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
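The "Recommended dictionary" blocks that close several of these runs list byte strings that libFuzzer's CMP instrumentation found productive (here "\001\000\000\000\000\000\000u", i.e. bytes 01 00 00 00 00 00 00 75). With a stock libFuzzer binary such entries can be fed back through -dict= to warm-start later sessions; this log does not show whether the SPDK wrapper forwards extra libFuzzer flags, so the snippet below is a generic illustration with hypothetical file and binary names:

    # write one dictionary entry in libFuzzer/AFL dict syntax, then reuse it
    printf '%s\n' 'kw1="\x01\x00\x00\x00\x00\x00\x00\x75"' > vfio.dict
    ./llvm_vfio_fuzz_binary -dict=vfio.dict ./corpus_dir/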
00:08:22.291 This may also happen if the target rejected all inputs we tried so far 00:08:22.550 [2024-07-12 21:31:01.151297] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:22.550 [2024-07-12 21:31:01.151340] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:22.810 NEW_FUNC[1/638]: 0x482fd0 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:22.810 NEW_FUNC[2/638]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:22.810 #7 NEW cov: 10730 ft: 10569 corp: 2/14b lim: 120 exec/s: 0 rss: 67Mb L: 13/13 MS: 5 CopyPart-ChangeBit-CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:08:23.069 [2024-07-12 21:31:01.612749] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:23.069 [2024-07-12 21:31:01.612793] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:23.069 #8 NEW cov: 10747 ft: 13457 corp: 3/123b lim: 120 exec/s: 0 rss: 68Mb L: 109/109 MS: 1 InsertRepeatedBytes- 00:08:23.069 [2024-07-12 21:31:01.815142] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:23.069 [2024-07-12 21:31:01.815173] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:23.328 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:23.328 #11 NEW cov: 10764 ft: 14398 corp: 4/181b lim: 120 exec/s: 0 rss: 69Mb L: 58/109 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:23.328 [2024-07-12 21:31:02.005816] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:23.328 [2024-07-12 21:31:02.005846] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:23.587 #12 NEW cov: 10764 ft: 15676 corp: 5/194b lim: 120 exec/s: 12 rss: 69Mb L: 13/109 MS: 1 ChangeBinInt- 00:08:23.587 [2024-07-12 21:31:02.194588] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:23.587 [2024-07-12 21:31:02.194620] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:23.587 #13 NEW cov: 10764 ft: 16012 corp: 6/303b lim: 120 exec/s: 13 rss: 69Mb L: 109/109 MS: 1 ShuffleBytes- 00:08:23.863 [2024-07-12 21:31:02.385478] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:23.863 [2024-07-12 21:31:02.385518] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:23.863 #14 NEW cov: 10764 ft: 16027 corp: 7/412b lim: 120 exec/s: 14 rss: 69Mb L: 109/109 MS: 1 ShuffleBytes- 00:08:23.863 [2024-07-12 21:31:02.575109] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:23.863 [2024-07-12 21:31:02.575139] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:24.260 #15 NEW cov: 10764 ft: 16432 corp: 8/426b lim: 120 exec/s: 15 rss: 69Mb L: 14/109 MS: 1 InsertByte- 00:08:24.260 [2024-07-12 21:31:02.766926] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:24.260 [2024-07-12 21:31:02.766956] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:24.260 #21 NEW cov: 10771 ft: 16538 corp: 9/535b lim: 120 exec/s: 21 rss: 69Mb L: 
109/109 MS: 1 ChangeByte- 00:08:24.260 [2024-07-12 21:31:02.959138] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:24.260 [2024-07-12 21:31:02.959168] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:24.518 #22 NEW cov: 10771 ft: 16862 corp: 10/647b lim: 120 exec/s: 11 rss: 69Mb L: 112/112 MS: 1 InsertRepeatedBytes- 00:08:24.518 #22 DONE cov: 10771 ft: 16862 corp: 10/647b lim: 120 exec/s: 11 rss: 69Mb 00:08:24.518 Done 22 runs in 2 second(s) 00:08:24.777 21:31:03 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:24.777 21:31:03 -- ../common.sh@72 -- # (( i++ )) 00:08:24.777 21:31:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:24.777 21:31:03 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:24.777 21:31:03 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:24.777 21:31:03 -- vfio/run.sh@23 -- # local timen=1 00:08:24.777 21:31:03 -- vfio/run.sh@24 -- # local core=0x1 00:08:24.777 21:31:03 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:24.778 21:31:03 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:24.778 21:31:03 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:24.778 21:31:03 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:24.778 21:31:03 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:24.778 21:31:03 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:24.778 21:31:03 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:24.778 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:24.778 21:31:03 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:24.778 [2024-07-12 21:31:03.378480] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:24.778 [2024-07-12 21:31:03.378548] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3591925 ] 00:08:24.778 EAL: No free 2048 kB hugepages reported on node 1 00:08:24.778 [2024-07-12 21:31:03.449558] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.778 [2024-07-12 21:31:03.518302] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:24.778 [2024-07-12 21:31:03.518463] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.037 INFO: Running with entropic power schedule (0xFF, 100). 
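One structural note while the last fuzzer (index 6, set_msix) spins up: each iteration ends with an explicit rm -rf of its /tmp/vfio-user-N tree, but the wrapper also arms a trap up front ("trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT", visible near the start of this section) so an interrupted session still scrubs the scratch directories; the trap is disarmed with "trap - SIGINT SIGTERM EXIT" once all seven fuzzers finish, as the tail of this log shows. Condensed, that guard pattern looks like this (cleanup and start_llvm_fuzz stand for the wrapper's own functions):

    trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT
    for (( i = 0; i < fuzz_num; i++ )); do
        start_llvm_fuzz "$i" 1 0x1      # wrapper function traced above
        rm -rf /tmp/vfio-user-$i        # normal-path cleanup per run
    done
    trap - SIGINT SIGTERM EXIT          # disarm after a clean finish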
00:08:25.037 INFO: Seed: 420887292 00:08:25.037 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x27cc1cc, 0x281ec63), 00:08:25.037 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x281ec68,0x2d495d8), 00:08:25.037 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:25.037 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.037 #2 INITED exec/s: 0 rss: 61Mb 00:08:25.037 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:25.037 This may also happen if the target rejected all inputs we tried so far 00:08:25.037 [2024-07-12 21:31:03.806536] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:25.037 [2024-07-12 21:31:03.806577] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:25.556 NEW_FUNC[1/637]: 0x483cc0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:25.556 NEW_FUNC[2/637]: 0x486290 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:25.556 #8 NEW cov: 10693 ft: 10564 corp: 2/22b lim: 90 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:08:25.556 [2024-07-12 21:31:04.263433] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:25.556 [2024-07-12 21:31:04.263484] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:25.815 NEW_FUNC[1/1]: 0x494d90 in malloc_completion_poller /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/bdev/malloc/bdev_malloc.c:849 00:08:25.815 #9 NEW cov: 10735 ft: 14056 corp: 3/44b lim: 90 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 InsertByte- 00:08:25.815 [2024-07-12 21:31:04.446470] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:25.815 [2024-07-12 21:31:04.446500] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:25.815 NEW_FUNC[1/1]: 0x192aaa0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:25.815 #11 NEW cov: 10752 ft: 14954 corp: 4/99b lim: 90 exec/s: 0 rss: 69Mb L: 55/55 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:26.074 [2024-07-12 21:31:04.638732] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:26.074 [2024-07-12 21:31:04.638760] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:26.074 #12 NEW cov: 10752 ft: 15975 corp: 5/120b lim: 90 exec/s: 12 rss: 69Mb L: 21/55 MS: 1 ShuffleBytes- 00:08:26.074 [2024-07-12 21:31:04.819664] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:26.074 [2024-07-12 21:31:04.819692] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:26.333 #13 NEW cov: 10752 ft: 16419 corp: 6/193b lim: 90 exec/s: 13 rss: 69Mb L: 73/73 MS: 1 CrossOver- 00:08:26.333 [2024-07-12 21:31:05.002916] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:26.333 [2024-07-12 21:31:05.002946] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:26.333 #14 NEW cov: 10752 ft: 16509 corp: 7/215b lim: 90 exec/s: 14 rss: 69Mb L: 22/73 MS: 1 InsertByte- 00:08:26.592 [2024-07-12 21:31:05.184103] vfio_user.c:3096:vfio_user_log: *ERROR*: 
/tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:26.592 [2024-07-12 21:31:05.184134] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:26.592 #15 NEW cov: 10752 ft: 16720 corp: 8/235b lim: 90 exec/s: 15 rss: 70Mb L: 20/73 MS: 1 EraseBytes- 00:08:26.592 [2024-07-12 21:31:05.366120] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:26.592 [2024-07-12 21:31:05.366149] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:26.852 #16 NEW cov: 10752 ft: 17020 corp: 9/256b lim: 90 exec/s: 16 rss: 70Mb L: 21/73 MS: 1 CopyPart- 00:08:26.852 [2024-07-12 21:31:05.547108] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:26.852 [2024-07-12 21:31:05.547138] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:27.111 #17 NEW cov: 10759 ft: 17122 corp: 10/329b lim: 90 exec/s: 17 rss: 70Mb L: 73/73 MS: 1 ShuffleBytes- 00:08:27.111 [2024-07-12 21:31:05.729781] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:27.111 [2024-07-12 21:31:05.729810] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:27.111 #18 NEW cov: 10759 ft: 17442 corp: 11/402b lim: 90 exec/s: 9 rss: 70Mb L: 73/73 MS: 1 CopyPart- 00:08:27.111 #18 DONE cov: 10759 ft: 17442 corp: 11/402b lim: 90 exec/s: 9 rss: 70Mb 00:08:27.111 Done 18 runs in 2 second(s) 00:08:27.370 21:31:06 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:08:27.370 21:31:06 -- ../common.sh@72 -- # (( i++ )) 00:08:27.370 21:31:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:27.370 21:31:06 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:08:27.370 00:08:27.370 real 0m19.396s 00:08:27.370 user 0m27.299s 00:08:27.370 sys 0m1.746s 00:08:27.370 21:31:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:27.370 21:31:06 -- common/autotest_common.sh@10 -- # set +x 00:08:27.370 ************************************ 00:08:27.370 END TEST vfio_fuzz 00:08:27.370 ************************************ 00:08:27.370 21:31:06 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:08:27.370 00:08:27.370 real 1m23.517s 00:08:27.370 user 2m7.761s 00:08:27.370 sys 0m8.865s 00:08:27.370 21:31:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:27.370 21:31:06 -- common/autotest_common.sh@10 -- # set +x 00:08:27.370 ************************************ 00:08:27.370 END TEST llvm_fuzz 00:08:27.370 ************************************ 00:08:27.627 21:31:06 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]] 00:08:27.627 21:31:06 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT 00:08:27.627 21:31:06 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup 00:08:27.627 21:31:06 -- common/autotest_common.sh@712 -- # xtrace_disable 00:08:27.627 21:31:06 -- common/autotest_common.sh@10 -- # set +x 00:08:27.627 21:31:06 -- spdk/autotest.sh@386 -- # autotest_cleanup 00:08:27.627 21:31:06 -- common/autotest_common.sh@1371 -- # local autotest_es=0 00:08:27.627 21:31:06 -- common/autotest_common.sh@1372 -- # xtrace_disable 00:08:27.627 21:31:06 -- common/autotest_common.sh@10 -- # set +x 00:08:34.188 INFO: APP EXITING 00:08:34.188 INFO: killing all VMs 00:08:34.188 INFO: killing vhost app 00:08:34.188 INFO: EXIT DONE 00:08:36.094 Waiting for block devices as requested 00:08:36.094 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:36.353 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:36.353 0000:00:04.5 (8086 
00:08:36.353 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:08:36.612 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:08:36.612 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:08:36.612 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:08:36.612 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:08:36.871 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:08:36.871 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:08:36.871 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:08:37.129 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:08:37.129 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:08:37.129 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:08:37.388 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:08:37.388 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:08:37.388 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme
00:08:41.592 Cleaning
00:08:41.592 Removing: /dev/shm/spdk_tgt_trace.pid3554566
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3552071
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3553356
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3554566
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3555213
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3555491
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3555812
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3556154
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3556408
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3556676
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3556958
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3557277
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3558141
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3561337
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3561634
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3561935
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3561949
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3562547
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3562786
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3563362
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3563381
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3563671
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3563939
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3564085
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3564255
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3564719
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3564908
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3565201
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3565519
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3565703
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3565840
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3565907
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3566173
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3566460
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3566732
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3566959
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3567130
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3567340
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3567598
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3567879
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3568168
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3568451
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3568722
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3568936
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3569099
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3569313
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3569582
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3569870
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3570137
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3570420
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3570693
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3570921
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3571085
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3571303
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3571559
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3571840
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3572114
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3572395
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3572669
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3572925
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3573089
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3573281
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3573535
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3573818
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3574091
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3574387
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3574656
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3574943
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3575177
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3575379
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3575543
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3575812
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3576107
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3576467
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3576931
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3577474
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3577931
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3578317
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3578854
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3579227
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3579685
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3580294
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3580720
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3581612
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3582160
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3582463
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3582996
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3583516
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3583832
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3584368
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3584842
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3585204
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3585747
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3586177
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3586577
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3587118
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3587628
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3587958
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3588499
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3589111
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3589570
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3589955
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3590495
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3591033
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3591586
00:08:41.592 Removing: /var/run/dpdk/spdk_pid3591925
00:08:41.592 Clean
00:08:41.592 killing process with pid 3507890
00:08:44.886 killing process with pid 3507887
00:08:44.886 killing process with pid 3507889
00:08:44.886 killing process with pid 3507888
00:08:44.886 21:31:23 -- common/autotest_common.sh@1436 -- # return 0
00:08:44.886 21:31:23 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup
00:08:44.886 21:31:23 -- common/autotest_common.sh@718 -- # xtrace_disable
00:08:44.886 21:31:23 -- common/autotest_common.sh@10 -- # set +x
00:08:44.886 21:31:23 -- spdk/autotest.sh@389 -- # timing_exit autotest
00:08:44.886 21:31:23 -- common/autotest_common.sh@718 -- # xtrace_disable
00:08:44.886 21:31:23 -- common/autotest_common.sh@10 -- # set +x
00:08:45.145 21:31:23 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:08:45.145 21:31:23 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:08:45.145 21:31:23 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:08:45.145 21:31:23 -- spdk/autotest.sh@394 -- # hash lcov
00:08:45.145 21:31:23 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]]
00:08:45.145 21:31:23 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:08:45.145 21:31:23 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:08:45.145 21:31:23 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:08:45.145 21:31:23 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:08:45.145 21:31:23 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:08:45.145 21:31:23 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:08:45.145 21:31:23 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:08:45.145 21:31:23 -- paths/export.sh@5 -- $ export PATH
00:08:45.145 21:31:23 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:08:45.145 21:31:23 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:08:45.145 21:31:23 -- common/autobuild_common.sh@435 -- $ date +%s
00:08:45.145 21:31:23 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1720812683.XXXXXX
00:08:45.145 21:31:23 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1720812683.dbp0Fe
00:08:45.145 21:31:23 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:08:45.145 21:31:23 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']'
00:08:45.145 21:31:23 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:08:45.145 21:31:23 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:08:45.145 21:31:23 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:08:45.145 21:31:23 -- common/autobuild_common.sh@451 -- $ get_config_params
00:08:45.145 21:31:23 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:08:45.145 21:31:23 -- common/autotest_common.sh@10 -- $ set +x
00:08:45.145 21:31:23 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
00:08:45.145 21:31:23 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112
00:08:45.145 21:31:23 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:08:45.145 21:31:23 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:08:45.145 21:31:23 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:08:45.145 21:31:23 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:08:45.145 21:31:23 -- spdk/autopackage.sh@19 -- $ timing_finish
00:08:45.145 21:31:23 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:08:45.145 21:31:23 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:08:45.145 21:31:23 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:08:45.145 21:31:23 -- spdk/autopackage.sh@20 -- $ exit 0
00:08:45.145 + [[ -n 3463965 ]]
00:08:45.145 + sudo kill 3463965
00:08:45.154 [Pipeline] }
00:08:45.171 [Pipeline] // stage
00:08:45.176 [Pipeline] }
00:08:45.193 [Pipeline] // timeout
00:08:45.198 [Pipeline] }
00:08:45.215 [Pipeline] // catchError
00:08:45.220 [Pipeline] }
00:08:45.237 [Pipeline] // wrap
00:08:45.243 [Pipeline] }
00:08:45.258 [Pipeline] // catchError
00:08:45.265 [Pipeline] stage
00:08:45.267 [Pipeline] { (Epilogue)
00:08:45.278 [Pipeline] catchError
00:08:45.279 [Pipeline] {
00:08:45.288 [Pipeline] echo
00:08:45.289 Cleanup processes
00:08:45.293 [Pipeline] sh
00:08:45.573 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:08:45.573 3600926 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:08:45.589 [Pipeline] sh
00:08:45.874 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:08:45.874 ++ grep -v 'sudo pgrep'
00:08:45.874 ++ awk '{print $1}'
00:08:45.874 + sudo kill -9
00:08:45.874 + true
00:08:45.887 [Pipeline] sh
00:08:46.171 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:08:46.171 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB
00:08:46.171 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB
00:08:47.122 [Pipeline] sh
00:08:47.406 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:08:47.406 Artifacts sizes are good
00:08:47.421 [Pipeline] archiveArtifacts
00:08:47.428 Archiving artifacts
00:08:47.483 [Pipeline] sh
00:08:47.769 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest
00:08:47.785 [Pipeline] cleanWs
00:08:47.795 [WS-CLEANUP] Deleting project workspace...
00:08:47.795 [WS-CLEANUP] Deferred wipeout is used...
00:08:47.802 [WS-CLEANUP] done
00:08:47.803 [Pipeline] }
00:08:47.825 [Pipeline] // catchError
00:08:47.839 [Pipeline] sh
00:08:48.133 + logger -p user.info -t JENKINS-CI
00:08:48.182 [Pipeline] }
00:08:48.199 [Pipeline] // stage
00:08:48.205 [Pipeline] }
00:08:48.223 [Pipeline] // node
00:08:48.230 [Pipeline] End of Pipeline
00:08:48.265 Finished: SUCCESS