00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v22.11" build number 932
00:00:00.001 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3599
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.012 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.013 The recommended git tool is: git
00:00:00.013 using credential 00000000-0000-0000-0000-000000000002
00:00:00.015 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.031 Fetching changes from the remote Git repository
00:00:00.034 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.056 Using shallow fetch with depth 1
00:00:00.056 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.056 > git --version # timeout=10
00:00:00.080 > git --version # 'git version 2.39.2'
00:00:00.080 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.115 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.115 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.212 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.223 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.236 Checking out Revision 44e7d6069a399ee2647233b387d68a938882e7b7 (FETCH_HEAD)
00:00:02.236 > git config core.sparsecheckout # timeout=10
00:00:02.247 > git read-tree -mu HEAD # timeout=10
00:00:02.263 > git checkout -f 44e7d6069a399ee2647233b387d68a938882e7b7 # timeout=5
00:00:02.282 Commit message: "scripts/bmc: Rework Get NIC Info cmd parser"
00:00:02.282 > git rev-list --no-walk 44e7d6069a399ee2647233b387d68a938882e7b7 # timeout=10
00:00:02.637 [Pipeline] Start of Pipeline
00:00:02.648 [Pipeline] library
00:00:02.650 Loading library shm_lib@master
00:00:02.651 Library shm_lib@master is cached. Copying from home.
00:00:02.666 [Pipeline] node
00:00:02.686 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:02.688 [Pipeline] {
00:00:02.697 [Pipeline] catchError
00:00:02.699 [Pipeline] {
00:00:02.709 [Pipeline] wrap
00:00:02.718 [Pipeline] {
00:00:02.723 [Pipeline] stage
00:00:02.724 [Pipeline] { (Prologue)
00:00:02.971 [Pipeline] sh
00:00:03.250 + logger -p user.info -t JENKINS-CI
00:00:03.267 [Pipeline] echo
00:00:03.269 Node: WFP20
00:00:03.276 [Pipeline] sh
00:00:03.573 [Pipeline] setCustomBuildProperty
00:00:03.584 [Pipeline] echo
00:00:03.585 Cleanup processes
00:00:03.590 [Pipeline] sh
00:00:03.871 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.871 1017266 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:03.883 [Pipeline] sh
00:00:04.167 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.167 ++ grep -v 'sudo pgrep'
00:00:04.167 ++ awk '{print $1}'
00:00:04.167 + sudo kill -9
00:00:04.167 + true
00:00:04.179 [Pipeline] cleanWs
00:00:04.187 [WS-CLEANUP] Deleting project workspace...
00:00:04.187 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.193 [WS-CLEANUP] done
00:00:04.197 [Pipeline] setCustomBuildProperty
00:00:04.207 [Pipeline] sh
00:00:04.532 + sudo git config --global --replace-all safe.directory '*'
00:00:04.626 [Pipeline] httpRequest
00:00:05.564 [Pipeline] echo
00:00:05.566 Sorcerer 10.211.164.101 is alive
00:00:05.572 [Pipeline] retry
00:00:05.574 [Pipeline] {
00:00:05.581 [Pipeline] httpRequest
00:00:05.585 HttpMethod: GET
00:00:05.585 URL: http://10.211.164.101/packages/jbp_44e7d6069a399ee2647233b387d68a938882e7b7.tar.gz
00:00:05.586 Sending request to url: http://10.211.164.101/packages/jbp_44e7d6069a399ee2647233b387d68a938882e7b7.tar.gz
00:00:05.587 Response Code: HTTP/1.1 200 OK
00:00:05.588 Success: Status code 200 is in the accepted range: 200,404
00:00:05.588 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_44e7d6069a399ee2647233b387d68a938882e7b7.tar.gz
00:00:06.604 [Pipeline] }
00:00:06.619 [Pipeline] // retry
00:00:06.625 [Pipeline] sh
00:00:06.906 + tar --no-same-owner -xf jbp_44e7d6069a399ee2647233b387d68a938882e7b7.tar.gz
00:00:06.917 [Pipeline] httpRequest
00:00:07.835 [Pipeline] echo
00:00:07.837 Sorcerer 10.211.164.101 is alive
00:00:07.847 [Pipeline] retry
00:00:07.849 [Pipeline] {
00:00:07.863 [Pipeline] httpRequest
00:00:07.867 HttpMethod: GET
00:00:07.868 URL: http://10.211.164.101/packages/spdk_726a04d705a30cca40ac8dc8d45f839602005b7a.tar.gz
00:00:07.868 Sending request to url: http://10.211.164.101/packages/spdk_726a04d705a30cca40ac8dc8d45f839602005b7a.tar.gz
00:00:07.889 Response Code: HTTP/1.1 200 OK
00:00:07.890 Success: Status code 200 is in the accepted range: 200,404
00:00:07.890 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_726a04d705a30cca40ac8dc8d45f839602005b7a.tar.gz
00:01:23.812 [Pipeline] }
00:01:23.831 [Pipeline] // retry
00:01:23.838 [Pipeline] sh
00:01:24.122 + tar --no-same-owner -xf spdk_726a04d705a30cca40ac8dc8d45f839602005b7a.tar.gz
00:01:26.669 [Pipeline] sh
00:01:26.952 + git -C spdk log --oneline -n5
00:01:26.952 726a04d70 test/nvmf: adjust timeout for bigger nvmes
00:01:26.952 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11
00:01:26.953 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched
00:01:26.953 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges
00:01:26.953 9469ea403 nvme/fio_plugin: add trim support
00:01:26.970 [Pipeline] withCredentials
00:01:26.980 > git --version # timeout=10
00:01:26.993 > git --version # 'git version 2.39.2'
00:01:27.010 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:01:27.011 [Pipeline] {
00:01:27.020 [Pipeline] retry
00:01:27.022 [Pipeline] {
00:01:27.037 [Pipeline] sh
00:01:27.321 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4
00:01:27.333 [Pipeline] }
00:01:27.350 [Pipeline] // retry
00:01:27.356 [Pipeline] }
00:01:27.373 [Pipeline] // withCredentials
00:01:27.383 [Pipeline] httpRequest
00:01:27.798 [Pipeline] echo
00:01:27.800 Sorcerer 10.211.164.101 is alive
00:01:27.810 [Pipeline] retry
00:01:27.812 [Pipeline] {
00:01:27.827 [Pipeline] httpRequest
00:01:27.831 HttpMethod: GET
00:01:27.831 URL: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:01:27.832 Sending request to url: http://10.211.164.101/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:01:27.834 Response Code: HTTP/1.1 200 OK
00:01:27.835 Success: Status code 200 is in the accepted range: 200,404
00:01:27.835 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:01:30.335 [Pipeline] }
00:01:30.353 [Pipeline] // retry
00:01:30.361 [Pipeline] sh
00:01:30.646 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:01:32.035 [Pipeline] sh
00:01:32.319 + git -C dpdk log --oneline -n5
00:01:32.319 caf0f5d395 version: 22.11.4
00:01:32.319 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt"
00:01:32.319 dc9c799c7d vhost: fix missing spinlock unlock
00:01:32.319 4307659a90 net/mlx5: fix LACP redirection in Rx domain
00:01:32.319 6ef77f2a5e net/gve: fix RX buffer size alignment
00:01:32.329 [Pipeline] }
00:01:32.342 [Pipeline] // stage
00:01:32.351 [Pipeline] stage
00:01:32.353 [Pipeline] { (Prepare)
00:01:32.372 [Pipeline] writeFile
00:01:32.387 [Pipeline] sh
00:01:32.670 + logger -p user.info -t JENKINS-CI
00:01:32.682 [Pipeline] sh
00:01:32.966 + logger -p user.info -t JENKINS-CI
00:01:32.978 [Pipeline] sh
00:01:33.262 + cat autorun-spdk.conf
00:01:33.262 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:33.262 SPDK_RUN_UBSAN=1
00:01:33.262 SPDK_TEST_FUZZER=1
00:01:33.262 SPDK_TEST_FUZZER_SHORT=1
00:01:33.262 SPDK_TEST_NATIVE_DPDK=v22.11.4
00:01:33.262 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:33.269 RUN_NIGHTLY=1
00:01:33.274 [Pipeline] readFile
00:01:33.298 [Pipeline] withEnv
00:01:33.300 [Pipeline] {
00:01:33.311 [Pipeline] sh
00:01:33.595 + set -ex
00:01:33.596 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:33.596 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:33.596 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:33.596 ++ SPDK_RUN_UBSAN=1
00:01:33.596 ++ SPDK_TEST_FUZZER=1
00:01:33.596 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:33.596 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:01:33.596 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:33.596 ++ RUN_NIGHTLY=1
00:01:33.596 + case $SPDK_TEST_NVMF_NICS in
00:01:33.596 + DRIVERS=
00:01:33.596 + [[ -n '' ]]
00:01:33.596 + exit 0
00:01:33.604 [Pipeline] }
00:01:33.618 [Pipeline] // withEnv
00:01:33.624 [Pipeline] }
00:01:33.637 [Pipeline] // stage
00:01:33.646 [Pipeline] catchError
00:01:33.648 [Pipeline] {
00:01:33.661 [Pipeline] timeout
00:01:33.662 Timeout set to expire in 30 min
00:01:33.663 [Pipeline] {
00:01:33.675 [Pipeline] stage
00:01:33.677 [Pipeline] { (Tests)
00:01:33.687 [Pipeline] sh
00:01:33.969 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:33.969 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:33.969 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:33.969 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:33.969 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:33.969 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:33.969 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:33.969 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:33.969 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:33.969 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:33.969 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:33.969 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:33.969 + source /etc/os-release
00:01:33.969 ++ NAME='Fedora Linux'
00:01:33.969 ++ VERSION='39 (Cloud Edition)'
00:01:33.969 ++ ID=fedora
00:01:33.969 ++ VERSION_ID=39
00:01:33.969 ++ VERSION_CODENAME=
00:01:33.969 ++ PLATFORM_ID=platform:f39
00:01:33.969 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:01:33.969 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:33.969 ++ LOGO=fedora-logo-icon
00:01:33.969 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:01:33.969 ++ HOME_URL=https://fedoraproject.org/
00:01:33.969 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:01:33.969 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:33.969 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:33.969 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:33.969 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:01:33.969 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:33.969 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:01:33.969 ++ SUPPORT_END=2024-11-12
00:01:33.969 ++ VARIANT='Cloud Edition'
00:01:33.969 ++ VARIANT_ID=cloud
00:01:33.969 + uname -a
00:01:33.969 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:01:33.969 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:37.260 Hugepages
00:01:37.260 node hugesize free / total
00:01:37.260 node0 1048576kB 0 / 0
00:01:37.260 node0 2048kB 0 / 0
00:01:37.260 node1 1048576kB 0 / 0
00:01:37.260 node1 2048kB 0 / 0
00:01:37.260
00:01:37.260 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:37.260 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:37.260 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:37.260 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:37.260 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:37.260 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:37.260 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:37.260 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:37.260 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:37.260 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:37.260 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:37.260 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:37.260 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:37.260 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:37.260 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:37.260 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:37.260 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:37.260 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:37.260 + rm -f /tmp/spdk-ld-path
00:01:37.260 + source autorun-spdk.conf
00:01:37.260 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:37.260 ++ SPDK_RUN_UBSAN=1
00:01:37.260 ++ SPDK_TEST_FUZZER=1
00:01:37.260 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:37.260 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:01:37.260 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:37.260 ++ RUN_NIGHTLY=1
00:01:37.260 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:37.260 + [[ -n '' ]]
00:01:37.260 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:37.260 + for M in /var/spdk/build-*-manifest.txt
00:01:37.260 + [[ -f /var/spdk/build-kernel-manifest.txt ]]
00:01:37.260 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:37.260 + for M in /var/spdk/build-*-manifest.txt
00:01:37.260 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:37.260 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:37.260 + for M in /var/spdk/build-*-manifest.txt
00:01:37.260 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:37.260 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:37.260 ++ uname
00:01:37.260 + [[ Linux == \L\i\n\u\x ]]
00:01:37.260 + sudo dmesg -T
00:01:37.260 + sudo dmesg --clear
00:01:37.260 + dmesg_pid=1018761
00:01:37.260 + [[ Fedora Linux == FreeBSD ]]
00:01:37.260 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:37.260 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:37.260 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:37.260 + [[ -x /usr/src/fio-static/fio ]]
00:01:37.260 + export FIO_BIN=/usr/src/fio-static/fio
00:01:37.260 + FIO_BIN=/usr/src/fio-static/fio
00:01:37.260 + sudo dmesg -Tw
00:01:37.260 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:37.260 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:37.260 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:37.260 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:37.260 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:37.260 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:37.260 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:37.260 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:37.260 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:37.260 Test configuration:
00:01:37.260 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:37.260 SPDK_RUN_UBSAN=1
00:01:37.260 SPDK_TEST_FUZZER=1
00:01:37.260 SPDK_TEST_FUZZER_SHORT=1
00:01:37.260 SPDK_TEST_NATIVE_DPDK=v22.11.4
00:01:37.260 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:37.260 RUN_NIGHTLY=1
00:01:37.260 12:01:24 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:01:37.260 12:01:24 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:01:37.260 12:01:24 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:01:37.260 12:01:24 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:01:37.260 12:01:24 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:37.260 12:01:24 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:37.260 12:01:24 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:37.261 12:01:24 -- paths/export.sh@5 -- $ export PATH
00:01:37.261 12:01:24 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:37.261 12:01:24 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:01:37.261 12:01:24 -- common/autobuild_common.sh@440 -- $ date +%s
00:01:37.261 12:01:24 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1730545284.XXXXXX
00:01:37.261 12:01:24 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1730545284.ZGfk9y
00:01:37.261 12:01:24 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]]
00:01:37.261 12:01:24 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']'
00:01:37.261 12:01:24 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:37.261 12:01:24 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:01:37.261 12:01:24 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:01:37.261 12:01:24 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:01:37.261 12:01:24 -- common/autobuild_common.sh@456 -- $ get_config_params
00:01:37.261 12:01:24 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:01:37.261 12:01:24 -- common/autotest_common.sh@10 -- $ set +x
00:01:37.261 12:01:24 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:01:37.261 12:01:24 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:01:37.261 12:01:24 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:37.261 12:01:24 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:37.261 12:01:24 -- spdk/autobuild.sh@16 -- $ date -u
00:01:37.261 Sat Nov 2 11:01:24 AM UTC 2024
00:01:37.261 12:01:24 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:37.261 LTS-66-g726a04d70
00:01:37.261 12:01:24 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:37.261 12:01:24 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:37.261 12:01:24 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:37.261 12:01:24 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:01:37.261 12:01:24 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:01:37.261 12:01:24 -- common/autotest_common.sh@10 -- $ set +x
00:01:37.261 ************************************
00:01:37.261 START TEST ubsan
00:01:37.261 ************************************
00:01:37.261 12:01:24 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan'
00:01:37.261 using ubsan
00:01:37.261
00:01:37.261 real 0m0.000s
00:01:37.261 user 0m0.000s
00:01:37.261 sys 0m0.000s
00:01:37.261 12:01:24 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:37.261 12:01:24 -- common/autotest_common.sh@10 -- $ set +x
00:01:37.261 ************************************
00:01:37.261 END TEST ubsan
00:01:37.261 ************************************
00:01:37.261 12:01:24 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']'
00:01:37.261 12:01:24 -- spdk/autobuild.sh@28 -- $ build_native_dpdk
00:01:37.261 12:01:24 -- common/autobuild_common.sh@432 -- $ run_test build_native_dpdk _build_native_dpdk
00:01:37.261 12:01:24 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
00:01:37.261 12:01:24 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:01:37.261 12:01:24 -- common/autotest_common.sh@10 -- $ set +x
00:01:37.261 ************************************
00:01:37.261 START TEST build_native_dpdk
00:01:37.261 ************************************
00:01:37.261 12:01:24 -- common/autotest_common.sh@1104 -- $ _build_native_dpdk
00:01:37.261 12:01:24 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir
00:01:37.261 12:01:24 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir
00:01:37.261 12:01:24 -- common/autobuild_common.sh@50 -- $ local compiler_version
00:01:37.261 12:01:24 -- common/autobuild_common.sh@51 -- $ local compiler
00:01:37.261 12:01:24 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods
00:01:37.261 12:01:24 -- common/autobuild_common.sh@53 -- $ local repo=dpdk
00:01:37.261 12:01:24 -- common/autobuild_common.sh@55 -- $ compiler=gcc
00:01:37.261 12:01:24 -- common/autobuild_common.sh@61 -- $ export CC=gcc
00:01:37.261 12:01:24 -- common/autobuild_common.sh@61 -- $ CC=gcc
00:01:37.261 12:01:24 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]]
00:01:37.261 12:01:24 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]]
00:01:37.261 12:01:24 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion
00:01:37.261 12:01:24 -- common/autobuild_common.sh@68 -- $ compiler_version=13
00:01:37.261 12:01:24 -- common/autobuild_common.sh@69 -- $ compiler_version=13
00:01:37.261 12:01:24 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:37.261 12:01:24 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:37.261 12:01:24 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:37.261 12:01:24 -- common/autobuild_common.sh@73 -- $ [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]]
00:01:37.261 12:01:24 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:37.261 12:01:24 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5
00:01:37.261 caf0f5d395 version: 22.11.4
00:01:37.261 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt"
00:01:37.261 dc9c799c7d vhost: fix missing spinlock unlock
00:01:37.261 4307659a90 net/mlx5: fix LACP redirection in Rx domain
00:01:37.261 6ef77f2a5e net/gve: fix RX buffer size alignment
00:01:37.261 12:01:24 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon'
00:01:37.261 12:01:24 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags=
00:01:37.261 12:01:24 -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4
00:01:37.261 12:01:24 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]]
00:01:37.261 12:01:24 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]]
00:01:37.261 12:01:24 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror'
00:01:37.261 12:01:24 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]]
00:01:37.261 12:01:24 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]]
00:01:37.261 12:01:24 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow'
00:01:37.261 12:01:24 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base")
00:01:37.261 12:01:24 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n
00:01:37.261 12:01:24 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
00:01:37.261 12:01:24 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
00:01:37.261 12:01:24 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]]
00:01:37.261 12:01:24 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:37.261 12:01:24 -- common/autobuild_common.sh@168 -- $ uname -s
00:01:37.261 12:01:24 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']'
00:01:37.261 12:01:24 -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0
00:01:37.261 12:01:24 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 21.11.0
00:01:37.261 12:01:24 -- scripts/common.sh@332 -- $ local ver1 ver1_l
00:01:37.261 12:01:24 -- scripts/common.sh@333 -- $ local ver2 ver2_l
00:01:37.261 12:01:24 -- scripts/common.sh@335 -- $ IFS=.-:
00:01:37.261 12:01:24 -- scripts/common.sh@335 -- $ read -ra ver1
00:01:37.261 12:01:24 -- scripts/common.sh@336 -- $ IFS=.-:
00:01:37.261 12:01:24 -- scripts/common.sh@336 -- $ read -ra ver2
00:01:37.261 12:01:24 -- scripts/common.sh@337 -- $ local 'op=<'
00:01:37.261 12:01:24 -- scripts/common.sh@339 -- $ ver1_l=3
00:01:37.261 12:01:24 -- scripts/common.sh@340 -- $ ver2_l=3
00:01:37.261 12:01:24 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
00:01:37.261 12:01:24 -- scripts/common.sh@343 -- $ case "$op" in
00:01:37.261 12:01:24 -- scripts/common.sh@344 -- $ : 1
00:01:37.261 12:01:24 -- scripts/common.sh@363 -- $ (( v = 0 ))
00:01:37.261 12:01:24 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:01:37.261 12:01:24 -- scripts/common.sh@364 -- $ decimal 22
00:01:37.261 12:01:24 -- scripts/common.sh@352 -- $ local d=22
00:01:37.261 12:01:24 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]]
00:01:37.261 12:01:24 -- scripts/common.sh@354 -- $ echo 22
00:01:37.261 12:01:24 -- scripts/common.sh@364 -- $ ver1[v]=22
00:01:37.261 12:01:24 -- scripts/common.sh@365 -- $ decimal 21
00:01:37.261 12:01:24 -- scripts/common.sh@352 -- $ local d=21
00:01:37.261 12:01:24 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]]
00:01:37.261 12:01:24 -- scripts/common.sh@354 -- $ echo 21
00:01:37.261 12:01:24 -- scripts/common.sh@365 -- $ ver2[v]=21
00:01:37.261 12:01:24 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
00:01:37.261 12:01:24 -- scripts/common.sh@366 -- $ return 1
00:01:37.261 12:01:24 -- common/autobuild_common.sh@173 -- $ patch -p1
00:01:37.261 patching file config/rte_config.h
00:01:37.261 Hunk #1 succeeded at 60 (offset 1 line).
00:01:37.261 12:01:24 -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0
00:01:37.261 12:01:24 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 24.07.0
00:01:37.261 12:01:24 -- scripts/common.sh@332 -- $ local ver1 ver1_l
00:01:37.261 12:01:24 -- scripts/common.sh@333 -- $ local ver2 ver2_l
00:01:37.261 12:01:24 -- scripts/common.sh@335 -- $ IFS=.-:
00:01:37.261 12:01:24 -- scripts/common.sh@335 -- $ read -ra ver1
00:01:37.261 12:01:24 -- scripts/common.sh@336 -- $ IFS=.-:
00:01:37.261 12:01:24 -- scripts/common.sh@336 -- $ read -ra ver2
00:01:37.261 12:01:24 -- scripts/common.sh@337 -- $ local 'op=<'
00:01:37.261 12:01:24 -- scripts/common.sh@339 -- $ ver1_l=3
00:01:37.261 12:01:24 -- scripts/common.sh@340 -- $ ver2_l=3
00:01:37.261 12:01:24 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
00:01:37.261 12:01:24 -- scripts/common.sh@343 -- $ case "$op" in
00:01:37.261 12:01:24 -- scripts/common.sh@344 -- $ : 1
00:01:37.261 12:01:24 -- scripts/common.sh@363 -- $ (( v = 0 ))
00:01:37.261 12:01:24 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:01:37.261 12:01:24 -- scripts/common.sh@364 -- $ decimal 22
00:01:37.261 12:01:24 -- scripts/common.sh@352 -- $ local d=22
00:01:37.261 12:01:24 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]]
00:01:37.261 12:01:24 -- scripts/common.sh@354 -- $ echo 22
00:01:37.261 12:01:24 -- scripts/common.sh@364 -- $ ver1[v]=22
00:01:37.261 12:01:24 -- scripts/common.sh@365 -- $ decimal 24
00:01:37.261 12:01:24 -- scripts/common.sh@352 -- $ local d=24
00:01:37.261 12:01:24 -- scripts/common.sh@353 -- $ [[ 24 =~ ^[0-9]+$ ]]
00:01:37.261 12:01:24 -- scripts/common.sh@354 -- $ echo 24
00:01:37.261 12:01:24 -- scripts/common.sh@365 -- $ ver2[v]=24
00:01:37.261 12:01:24 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
00:01:37.262 12:01:24 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] ))
00:01:37.262 12:01:24 -- scripts/common.sh@367 -- $ return 0
00:01:37.262 12:01:24 -- common/autobuild_common.sh@177 -- $ patch -p1
00:01:37.262 patching file lib/pcapng/rte_pcapng.c
00:01:37.262 Hunk #1 succeeded at 110 (offset -18 lines).
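The xtrace above steps through the cmp_versions/lt helpers in scripts/common.sh twice (22.11.4 vs 21.11.0, then 22.11.4 vs 24.07.0) to decide which DPDK compatibility patches to apply. As a minimal standalone sketch of that dotted-version comparison (an illustrative re-implementation, not the exact SPDK helper, which also dispatches on a '<'/'>'/'=' op argument as seen in the trace):

    # Sketch of the dotted-version check traced above.
    # "lt A B" exits 0 only when A is strictly older than B.
    lt() {
        local -a ver1 ver2
        local v max
        IFS=.-: read -ra ver1 <<< "$1"   # split on ".", "-" and ":"
        IFS=.-: read -ra ver2 <<< "$2"
        max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < max; v++ )); do
            # Compare components numerically; force base 10 so a
            # component such as "07" is not parsed as octal.
            (( 10#${ver1[v]:-0} > 10#${ver2[v]:-0} )) && return 1
            (( 10#${ver1[v]:-0} < 10#${ver2[v]:-0} )) && return 0
        done
        return 1   # equal is not "less than"
    }
    lt 22.11.4 21.11.0 || echo "not older"   # returns 1, as in the first trace
    lt 22.11.4 24.07.0 && echo "older"       # returns 0, as in the second trace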
00:01:37.262 12:01:24 -- common/autobuild_common.sh@180 -- $ dpdk_kmods=false
00:01:37.262 12:01:24 -- common/autobuild_common.sh@181 -- $ uname -s
00:01:37.262 12:01:24 -- common/autobuild_common.sh@181 -- $ '[' Linux = FreeBSD ']'
00:01:37.262 12:01:24 -- common/autobuild_common.sh@185 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base
00:01:37.262 12:01:24 -- common/autobuild_common.sh@185 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:41.454 The Meson build system
00:01:41.454 Version: 1.5.0
00:01:41.454 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:41.454 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp
00:01:41.454 Build type: native build
00:01:41.454 Program cat found: YES (/usr/bin/cat)
00:01:41.454 Project name: DPDK
00:01:41.454 Project version: 22.11.4
00:01:41.454 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)")
00:01:41.454 C linker for the host machine: gcc ld.bfd 2.40-14
00:01:41.454 Host machine cpu family: x86_64
00:01:41.454 Host machine cpu: x86_64
00:01:41.454 Message: ## Building in Developer Mode ##
00:01:41.454 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:41.454 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh)
00:01:41.454 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh)
00:01:41.454 Program objdump found: YES (/usr/bin/objdump)
00:01:41.454 Program python3 found: YES (/usr/bin/python3)
00:01:41.454 Program cat found: YES (/usr/bin/cat)
00:01:41.454 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead.
00:01:41.454 Checking for size of "void *" : 8
00:01:41.454 Checking for size of "void *" : 8 (cached)
00:01:41.454 Library m found: YES
00:01:41.454 Library numa found: YES
00:01:41.454 Has header "numaif.h" : YES
00:01:41.454 Library fdt found: NO
00:01:41.454 Library execinfo found: NO
00:01:41.454 Has header "execinfo.h" : YES
00:01:41.454 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:01:41.454 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:41.454 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:41.454 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:41.454 Run-time dependency openssl found: YES 3.1.1
00:01:41.454 Run-time dependency libpcap found: YES 1.10.4
00:01:41.454 Has header "pcap.h" with dependency libpcap: YES
00:01:41.454 Compiler for C supports arguments -Wcast-qual: YES
00:01:41.454 Compiler for C supports arguments -Wdeprecated: YES
00:01:41.454 Compiler for C supports arguments -Wformat: YES
00:01:41.454 Compiler for C supports arguments -Wformat-nonliteral: NO
00:01:41.454 Compiler for C supports arguments -Wformat-security: NO
00:01:41.454 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:41.454 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:41.454 Compiler for C supports arguments -Wnested-externs: YES
00:01:41.454 Compiler for C supports arguments -Wold-style-definition: YES
00:01:41.454 Compiler for C supports arguments -Wpointer-arith: YES
00:01:41.454 Compiler for C supports arguments -Wsign-compare: YES
00:01:41.454 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:41.454 Compiler for C supports arguments -Wundef: YES
00:01:41.454 Compiler for C supports arguments -Wwrite-strings: YES
00:01:41.454 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:41.454 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:01:41.454 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:41.454 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:01:41.454 Compiler for C supports arguments -mavx512f: YES
00:01:41.454 Checking if "AVX512 checking" compiles: YES
00:01:41.454 Fetching value of define "__SSE4_2__" : 1
00:01:41.454 Fetching value of define "__AES__" : 1
00:01:41.454 Fetching value of define "__AVX__" : 1
00:01:41.454 Fetching value of define "__AVX2__" : 1
00:01:41.454 Fetching value of define "__AVX512BW__" : 1
00:01:41.454 Fetching value of define "__AVX512CD__" : 1
00:01:41.454 Fetching value of define "__AVX512DQ__" : 1
00:01:41.454 Fetching value of define "__AVX512F__" : 1
00:01:41.454 Fetching value of define "__AVX512VL__" : 1
00:01:41.454 Fetching value of define "__PCLMUL__" : 1
00:01:41.454 Fetching value of define "__RDRND__" : 1
00:01:41.454 Fetching value of define "__RDSEED__" : 1
00:01:41.454 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:41.454 Compiler for C supports arguments -Wno-format-truncation: YES
00:01:41.454 Message: lib/kvargs: Defining dependency "kvargs"
00:01:41.454 Message: lib/telemetry: Defining dependency "telemetry"
00:01:41.454 Checking for function "getentropy" : YES
00:01:41.454 Message: lib/eal: Defining dependency "eal"
00:01:41.454 Message: lib/ring: Defining dependency "ring"
00:01:41.454 Message: lib/rcu: Defining dependency "rcu"
00:01:41.454 Message: lib/mempool: Defining dependency "mempool"
00:01:41.454 Message: lib/mbuf: Defining dependency "mbuf"
00:01:41.454 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:41.454 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:41.454 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:41.454 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:41.454 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:41.454 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:01:41.454 Compiler for C supports arguments -mpclmul: YES
00:01:41.454 Compiler for C supports arguments -maes: YES
00:01:41.454 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:41.454 Compiler for C supports arguments -mavx512bw: YES
00:01:41.454 Compiler for C supports arguments -mavx512dq: YES
00:01:41.454 Compiler for C supports arguments -mavx512vl: YES
00:01:41.454 Compiler for C supports arguments -mvpclmulqdq: YES
00:01:41.454 Compiler for C supports arguments -mavx2: YES
00:01:41.454 Compiler for C supports arguments -mavx: YES
00:01:41.454 Message: lib/net: Defining dependency "net"
00:01:41.454 Message: lib/meter: Defining dependency "meter"
00:01:41.454 Message: lib/ethdev: Defining dependency "ethdev"
00:01:41.454 Message: lib/pci: Defining dependency "pci"
00:01:41.454 Message: lib/cmdline: Defining dependency "cmdline"
00:01:41.454 Message: lib/metrics: Defining dependency "metrics"
00:01:41.454 Message: lib/hash: Defining dependency "hash"
00:01:41.454 Message: lib/timer: Defining dependency "timer"
00:01:41.454 Fetching value of define "__AVX2__" : 1 (cached)
00:01:41.454 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:41.454 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:41.454 Fetching value of define "__AVX512CD__" : 1 (cached)
00:01:41.455 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:41.455 Message: lib/acl: Defining dependency "acl"
00:01:41.455 Message: lib/bbdev: Defining dependency "bbdev"
00:01:41.455 Message: lib/bitratestats: Defining dependency "bitratestats"
00:01:41.455 Run-time dependency libelf found: YES 0.191
00:01:41.455 Message: lib/bpf: Defining dependency "bpf"
00:01:41.455 Message: lib/cfgfile: Defining dependency "cfgfile"
00:01:41.455 Message: lib/compressdev: Defining dependency "compressdev"
00:01:41.455 Message: lib/cryptodev: Defining dependency "cryptodev"
00:01:41.455 Message: lib/distributor: Defining dependency "distributor"
00:01:41.455 Message: lib/efd: Defining dependency "efd"
00:01:41.455 Message: lib/eventdev: Defining dependency "eventdev"
00:01:41.455 Message: lib/gpudev: Defining dependency "gpudev"
00:01:41.455 Message: lib/gro: Defining dependency "gro"
00:01:41.455 Message: lib/gso: Defining dependency "gso"
00:01:41.455 Message: lib/ip_frag: Defining dependency "ip_frag"
00:01:41.455 Message: lib/jobstats: Defining dependency "jobstats"
00:01:41.455 Message: lib/latencystats: Defining dependency "latencystats"
00:01:41.455 Message: lib/lpm: Defining dependency "lpm"
00:01:41.455 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:41.455 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:41.455 Fetching value of define "__AVX512IFMA__" : (undefined)
00:01:41.455 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES
00:01:41.455 Message: lib/member: Defining dependency "member"
00:01:41.455 Message: lib/pcapng: Defining dependency "pcapng"
00:01:41.455 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:41.455 Message: lib/power: Defining dependency "power"
00:01:41.455 Message: lib/rawdev: Defining dependency "rawdev"
00:01:41.455 Message: lib/regexdev: Defining dependency "regexdev"
00:01:41.455 Message: lib/dmadev: Defining dependency "dmadev"
00:01:41.455 Message: lib/rib: Defining dependency "rib"
00:01:41.455 Message: lib/reorder: Defining dependency "reorder"
00:01:41.455 Message: lib/sched: Defining dependency "sched"
00:01:41.455 Message: lib/security: Defining dependency "security"
00:01:41.455 Message: lib/stack: Defining dependency "stack"
00:01:41.455 Has header "linux/userfaultfd.h" : YES
00:01:41.455 Message: lib/vhost: Defining dependency "vhost"
00:01:41.455 Message: lib/ipsec: Defining dependency "ipsec"
00:01:41.455 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:41.455 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:41.455 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:41.455 Message: lib/fib: Defining dependency "fib"
00:01:41.455 Message: lib/port: Defining dependency "port"
00:01:41.455 Message: lib/pdump: Defining dependency "pdump"
00:01:41.455 Message: lib/table: Defining dependency "table"
00:01:41.455 Message: lib/pipeline: Defining dependency "pipeline"
00:01:41.455 Message: lib/graph: Defining dependency "graph"
00:01:41.455 Message: lib/node: Defining dependency "node"
00:01:41.455 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:01:41.455 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:41.455 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:41.455 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:01:41.455 Compiler for C supports arguments -Wno-sign-compare: YES
00:01:41.455 Compiler for C supports arguments -Wno-unused-value: YES
00:01:41.455 Compiler for C supports arguments -Wno-format: YES
00:01:41.455 Compiler for C supports arguments -Wno-format-security: YES
00:01:41.455 Compiler for C supports arguments -Wno-format-nonliteral: YES
00:01:42.402 Compiler for C supports arguments -Wno-strict-aliasing: YES
00:01:42.402 Compiler for C supports arguments -Wno-unused-but-set-variable: YES
00:01:42.402 Compiler for C supports arguments -Wno-unused-parameter: YES
00:01:42.402 Fetching value of define "__AVX2__" : 1 (cached)
00:01:42.402 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:42.402 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:42.402 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:42.402 Compiler for C supports arguments -mavx512bw: YES (cached)
00:01:42.402 Compiler for C supports arguments -march=skylake-avx512: YES
00:01:42.402 Message: drivers/net/i40e: Defining dependency "net_i40e"
00:01:42.402 Program doxygen found: YES (/usr/local/bin/doxygen)
00:01:42.402 Configuring doxy-api.conf using configuration
00:01:42.402 Program sphinx-build found: NO
00:01:42.402 Configuring rte_build_config.h using configuration
00:01:42.402 Message:
00:01:42.402 =================
00:01:42.402 Applications Enabled
00:01:42.402 =================
00:01:42.402
00:01:42.402 apps:
00:01:42.402 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf,
00:01:42.402 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad,
00:01:42.402 test-security-perf,
00:01:42.402
00:01:42.402 Message:
00:01:42.402 =================
00:01:42.402 Libraries Enabled
00:01:42.402 =================
00:01:42.402
00:01:42.402 libs:
00:01:42.402 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net,
00:01:42.402 meter, ethdev, pci, cmdline, metrics, hash, timer, acl,
00:01:42.402 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd,
00:01:42.402 eventdev, gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm,
00:01:42.402 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder,
00:01:42.402 sched, security, stack, vhost, ipsec, fib, port, pdump,
00:01:42.402 table, pipeline, graph, node,
00:01:42.402
00:01:42.402 Message:
00:01:42.402 ===============
00:01:42.402 Drivers Enabled
00:01:42.402 ===============
00:01:42.402
00:01:42.402 common:
00:01:42.402
00:01:42.402 bus:
00:01:42.402 pci, vdev,
00:01:42.402 mempool:
00:01:42.402 ring,
00:01:42.402 dma:
00:01:42.402
00:01:42.402 net:
00:01:42.402 i40e,
00:01:42.402 raw:
00:01:42.402
00:01:42.402 crypto:
00:01:42.402
00:01:42.402 compress:
00:01:42.402
00:01:42.402 regex:
00:01:42.402
00:01:42.402 vdpa:
00:01:42.402
00:01:42.402 event:
00:01:42.402
00:01:42.402 baseband:
00:01:42.402
00:01:42.402 gpu:
00:01:42.402
00:01:42.402
00:01:42.402 Message:
00:01:42.402 =================
00:01:42.402 Content Skipped
00:01:42.402 =================
00:01:42.402
00:01:42.402 apps:
00:01:42.402
00:01:42.402 libs:
00:01:42.402 kni: explicitly disabled via build config (deprecated lib)
00:01:42.402 flow_classify: explicitly disabled via build config (deprecated lib)
00:01:42.402
00:01:42.402 drivers:
00:01:42.402 common/cpt: not in enabled drivers build config
00:01:42.402 common/dpaax: not in enabled drivers build config
00:01:42.402 common/iavf: not in enabled drivers build config
00:01:42.402 common/idpf: not in enabled drivers build config
00:01:42.402 common/mvep: not in enabled drivers build config
00:01:42.402 common/octeontx: not in enabled drivers build config
00:01:42.402 bus/auxiliary: not in enabled drivers build config
00:01:42.402 bus/dpaa: not in enabled drivers build config
00:01:42.402 bus/fslmc: not in enabled drivers build config
00:01:42.402 bus/ifpga: not in enabled drivers build config
00:01:42.402 bus/vmbus: not in enabled drivers build config
00:01:42.403 common/cnxk: not in enabled drivers build config
00:01:42.403 common/mlx5: not in enabled drivers build config
00:01:42.403 common/qat: not in enabled drivers build config
00:01:42.403 common/sfc_efx: not in enabled drivers build config
00:01:42.403 mempool/bucket: not in enabled drivers build config
00:01:42.403 mempool/cnxk: not in enabled drivers build config
00:01:42.403 mempool/dpaa: not in enabled drivers build config
00:01:42.403 mempool/dpaa2: not in enabled drivers build config
00:01:42.403 mempool/octeontx: not in enabled drivers build config
00:01:42.403 mempool/stack: not in enabled drivers build config
00:01:42.403 dma/cnxk: not in enabled drivers build config
00:01:42.403 dma/dpaa: not in enabled drivers build config
00:01:42.403 dma/dpaa2: not in enabled drivers build config
00:01:42.403 dma/hisilicon: not in enabled drivers build config
00:01:42.403 dma/idxd: not in enabled drivers build config
00:01:42.403 dma/ioat: not in enabled drivers build config
00:01:42.403 dma/skeleton: not in enabled drivers build config
00:01:42.403 net/af_packet: not in enabled drivers build config
00:01:42.403 net/af_xdp: not in enabled drivers build config
00:01:42.403 net/ark: not in enabled drivers build config
00:01:42.403 net/atlantic: not in enabled drivers build config
00:01:42.403 net/avp: not in enabled drivers build config
00:01:42.403 net/axgbe: not in enabled drivers build config
00:01:42.403 net/bnx2x: not in enabled drivers build config
00:01:42.403 net/bnxt: not in enabled drivers build config
00:01:42.403 net/bonding: not in enabled drivers build config
00:01:42.403 net/cnxk: not in enabled drivers build config
00:01:42.403 net/cxgbe: not in enabled drivers build config
00:01:42.403 net/dpaa: not in enabled drivers build config
00:01:42.403 net/dpaa2: not in enabled drivers build config
00:01:42.403 net/e1000: not in enabled drivers build config
00:01:42.403 net/ena: not in enabled drivers build config
00:01:42.403 net/enetc: not in enabled drivers build config
00:01:42.403 net/enetfec: not in enabled drivers build config
00:01:42.403 net/enic: not in enabled drivers build config
00:01:42.403 net/failsafe: not in enabled drivers build config
00:01:42.403 net/fm10k: not in enabled drivers build config
00:01:42.403 net/gve: not in enabled drivers build config
00:01:42.403 net/hinic: not in enabled drivers build config
00:01:42.403 net/hns3: not in enabled drivers build config
00:01:42.403 net/iavf: not in enabled drivers build config
00:01:42.403 net/ice: not in enabled drivers build config
00:01:42.403 net/idpf: not in enabled drivers build config
00:01:42.403 net/igc: not in enabled drivers build config
00:01:42.403 net/ionic: not in enabled drivers build config
00:01:42.403 net/ipn3ke: not in enabled drivers build config
00:01:42.403 net/ixgbe: not in enabled drivers build config
00:01:42.403 net/kni: not in enabled drivers build config
00:01:42.403 net/liquidio: not in enabled drivers build config
00:01:42.403 net/mana: not in enabled drivers build config
00:01:42.403 net/memif: not in enabled drivers build config
00:01:42.403 net/mlx4: not in enabled drivers build config
00:01:42.403 net/mlx5: not in enabled drivers build config
00:01:42.403 net/mvneta: not in enabled drivers build config
00:01:42.403 net/mvpp2: not in enabled drivers build config
00:01:42.403 net/netvsc: not in enabled drivers build config
00:01:42.403 net/nfb: not in enabled drivers build config
00:01:42.403 net/nfp: not in enabled drivers build config
00:01:42.403 net/ngbe: not in enabled drivers build config
00:01:42.403 net/null: not in enabled drivers build config
00:01:42.403 net/octeontx: not in enabled drivers build config
00:01:42.403 net/octeon_ep: not in enabled drivers build config
00:01:42.403 net/pcap: not in enabled drivers build config
00:01:42.403 net/pfe: not in enabled drivers build config
00:01:42.403 net/qede: not in enabled drivers build config
00:01:42.403 net/ring: not in enabled drivers build config
00:01:42.403 net/sfc: not in enabled drivers build config
00:01:42.403 net/softnic: not in enabled drivers build config
00:01:42.403 net/tap: not in enabled drivers build config
00:01:42.403 net/thunderx: not in enabled drivers build config
00:01:42.403 net/txgbe: not in enabled drivers build config
00:01:42.403 net/vdev_netvsc: not in enabled drivers build config
00:01:42.403 net/vhost: not in enabled drivers build config
00:01:42.403 net/virtio: not in enabled drivers build config
00:01:42.403 net/vmxnet3: not in enabled drivers build config
00:01:42.403 raw/cnxk_bphy: not in enabled drivers build config
00:01:42.403 raw/cnxk_gpio: not in enabled drivers build config
00:01:42.403 raw/dpaa2_cmdif: not in enabled drivers build config
00:01:42.403 raw/ifpga: not in enabled drivers build config
00:01:42.403 raw/ntb: not in enabled drivers build config
00:01:42.403 raw/skeleton: not in enabled drivers build config
00:01:42.403 crypto/armv8: not in enabled drivers build config
00:01:42.403 crypto/bcmfs: not in enabled drivers build config
00:01:42.403 crypto/caam_jr: not in enabled drivers build config
00:01:42.403 crypto/ccp: not in enabled drivers build config
00:01:42.403 crypto/cnxk: not in enabled drivers build config
00:01:42.403 crypto/dpaa_sec: not in enabled drivers build config
00:01:42.403 crypto/dpaa2_sec: not in enabled drivers build config
00:01:42.403 crypto/ipsec_mb: not in enabled drivers build config
00:01:42.403 crypto/mlx5: not in enabled drivers build config
00:01:42.403 crypto/mvsam: not in enabled drivers build config
00:01:42.403 crypto/nitrox: not in enabled drivers build config
00:01:42.403 crypto/null: not in enabled drivers build config
00:01:42.403 crypto/octeontx: not in enabled drivers build config
00:01:42.403 crypto/openssl: not in enabled drivers build config
00:01:42.403 crypto/scheduler: not in enabled drivers build config
00:01:42.403 crypto/uadk: not in enabled drivers build config
00:01:42.403 crypto/virtio: not in enabled drivers build config
00:01:42.403 compress/isal: not in enabled drivers build config
00:01:42.403 compress/mlx5: not in enabled drivers build config
00:01:42.403 compress/octeontx: not in enabled drivers build config
00:01:42.403 compress/zlib: not in enabled drivers build config
00:01:42.403 regex/mlx5: not in enabled drivers build config
00:01:42.403 regex/cn9k: not in enabled drivers build config
00:01:42.403 vdpa/ifc: not in enabled drivers build config
00:01:42.403 vdpa/mlx5: not in enabled drivers build config
00:01:42.403 vdpa/sfc: not in enabled drivers build config
00:01:42.403 event/cnxk: not in enabled drivers build config
00:01:42.403 event/dlb2: not in enabled drivers build config
00:01:42.403 event/dpaa: not in enabled drivers build config
00:01:42.403 event/dpaa2: not in enabled drivers build config
00:01:42.403 event/dsw: not in enabled drivers build config
00:01:42.403 event/opdl: not in enabled drivers build config
00:01:42.403 event/skeleton: not in enabled drivers build config
00:01:42.403 event/sw: not in enabled drivers build config
00:01:42.403 event/octeontx: not in enabled drivers build config
00:01:42.403 baseband/acc: not in enabled drivers build config
00:01:42.403 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:01:42.403 baseband/fpga_lte_fec: not in enabled drivers build config
00:01:42.403 baseband/la12xx: not in enabled drivers build config
00:01:42.403 baseband/null: not in enabled drivers build config
00:01:42.403 baseband/turbo_sw: not in enabled drivers build config
00:01:42.403 gpu/cuda: not in enabled drivers build config
00:01:42.403
00:01:42.403
00:01:42.403 Build targets in project: 311
00:01:42.403
00:01:42.403 DPDK 22.11.4
00:01:42.403
00:01:42.403 User defined options
00:01:42.403 libdir : lib
00:01:42.403 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:42.403 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:01:42.403 c_link_args :
00:01:42.403 enable_docs : false
00:01:42.403 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:42.403 enable_kmods : false
00:01:42.403 machine : native
00:01:42.403 tests : false
00:01:42.403
00:01:42.403 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:42.403
00:01:42.403 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
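The warning above concerns only the spelling of the configure invocation; the build that follows is unaffected. As a sketch, the equivalent non-deprecated form (options copied verbatim from the meson line logged earlier; substitute your own workspace path and job count) would be roughly:

    # Same DPDK configure/build step with the explicit "setup" subcommand.
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
        --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false \
        -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Dmachine=native \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
    ninja -C build-tmp -j"$(nproc)"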
00:01:42.403 12:01:29 -- common/autobuild_common.sh@189 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:01:42.403 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:42.403 [1/740] Generating lib/rte_kvargs_def with a custom command 00:01:42.403 [2/740] Generating lib/rte_kvargs_mingw with a custom command 00:01:42.403 [3/740] Generating lib/rte_telemetry_mingw with a custom command 00:01:42.403 [4/740] Generating lib/rte_telemetry_def with a custom command 00:01:42.665 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:42.665 [6/740] Generating lib/rte_ring_def with a custom command 00:01:42.665 [7/740] Generating lib/rte_eal_def with a custom command 00:01:42.665 [8/740] Generating lib/rte_eal_mingw with a custom command 00:01:42.665 [9/740] Generating lib/rte_ring_mingw with a custom command 00:01:42.665 [10/740] Generating lib/rte_rcu_def with a custom command 00:01:42.665 [11/740] Generating lib/rte_mempool_def with a custom command 00:01:42.665 [12/740] Generating lib/rte_rcu_mingw with a custom command 00:01:42.665 [13/740] Generating lib/rte_mbuf_def with a custom command 00:01:42.665 [14/740] Generating lib/rte_mbuf_mingw with a custom command 00:01:42.665 [15/740] Generating lib/rte_mempool_mingw with a custom command 00:01:42.665 [16/740] Generating lib/rte_net_mingw with a custom command 00:01:42.665 [17/740] Generating lib/rte_net_def with a custom command 00:01:42.665 [18/740] Generating lib/rte_meter_def with a custom command 00:01:42.665 [19/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:42.665 [20/740] Generating lib/rte_meter_mingw with a custom command 00:01:42.665 [21/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:42.665 [22/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:42.665 [23/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:42.665 [24/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:42.665 [25/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:42.665 [26/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:01:42.665 [27/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:42.665 [28/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:42.665 [29/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:42.665 [30/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:42.665 [31/740] Generating lib/rte_ethdev_mingw with a custom command 00:01:42.665 [32/740] Generating lib/rte_pci_def with a custom command 00:01:42.665 [33/740] Generating lib/rte_pci_mingw with a custom command 00:01:42.665 [34/740] Generating lib/rte_ethdev_def with a custom command 00:01:42.665 [35/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:42.665 [36/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:42.665 [37/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:42.665 [38/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:42.665 [39/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:42.665 [40/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:42.665 [41/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 
00:01:42.665 [42/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:42.666 [43/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:42.666 [44/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:42.666 [45/740] Generating lib/rte_cmdline_def with a custom command 00:01:42.666 [46/740] Linking static target lib/librte_kvargs.a 00:01:42.666 [47/740] Generating lib/rte_metrics_mingw with a custom command 00:01:42.666 [48/740] Generating lib/rte_cmdline_mingw with a custom command 00:01:42.666 [49/740] Generating lib/rte_metrics_def with a custom command 00:01:42.666 [50/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:42.666 [51/740] Generating lib/rte_hash_def with a custom command 00:01:42.666 [52/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:42.666 [53/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:42.666 [54/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:42.666 [55/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:42.666 [56/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:42.666 [57/740] Generating lib/rte_hash_mingw with a custom command 00:01:42.666 [58/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:42.666 [59/740] Generating lib/rte_timer_mingw with a custom command 00:01:42.666 [60/740] Generating lib/rte_timer_def with a custom command 00:01:42.666 [61/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:42.666 [62/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:42.666 [63/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:42.666 [64/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:42.666 [65/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:42.666 [66/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:42.666 [67/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:42.928 [68/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:42.928 [69/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:42.928 [70/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:42.928 [71/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:42.928 [72/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:42.928 [73/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:42.928 [74/740] Generating lib/rte_acl_def with a custom command 00:01:42.928 [75/740] Generating lib/rte_bbdev_mingw with a custom command 00:01:42.928 [76/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:42.928 [77/740] Generating lib/rte_acl_mingw with a custom command 00:01:42.928 [78/740] Generating lib/rte_bbdev_def with a custom command 00:01:42.928 [79/740] Generating lib/rte_bitratestats_def with a custom command 00:01:42.928 [80/740] Linking static target lib/librte_pci.a 00:01:42.928 [81/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:42.928 [82/740] Generating lib/rte_bitratestats_mingw with a custom command 00:01:42.928 [83/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:42.928 [84/740] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:42.928 [85/740] Linking static target lib/librte_meter.a 00:01:42.928 [86/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:42.928 [87/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:42.928 [88/740] Generating lib/rte_bpf_def with a custom command 00:01:42.928 [89/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:42.928 [90/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:42.928 [91/740] Generating lib/rte_bpf_mingw with a custom command 00:01:42.928 [92/740] Generating lib/rte_cfgfile_def with a custom command 00:01:42.928 [93/740] Generating lib/rte_cfgfile_mingw with a custom command 00:01:42.928 [94/740] Generating lib/rte_compressdev_def with a custom command 00:01:42.928 [95/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:42.928 [96/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:42.928 [97/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:42.928 [98/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:42.928 [99/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:42.928 [100/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:42.928 [101/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:42.928 [102/740] Generating lib/rte_compressdev_mingw with a custom command 00:01:42.928 [103/740] Linking static target lib/librte_ring.a 00:01:42.928 [104/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:42.928 [105/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:01:42.928 [106/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:42.928 [107/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:42.928 [108/740] Generating lib/rte_cryptodev_mingw with a custom command 00:01:42.928 [109/740] Generating lib/rte_cryptodev_def with a custom command 00:01:42.928 [110/740] Generating lib/rte_distributor_mingw with a custom command 00:01:42.928 [111/740] Generating lib/rte_distributor_def with a custom command 00:01:42.928 [112/740] Generating lib/rte_efd_def with a custom command 00:01:42.928 [113/740] Generating lib/rte_efd_mingw with a custom command 00:01:42.928 [114/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:42.928 [115/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:42.928 [116/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:42.928 [117/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:42.928 [118/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:42.928 [119/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:42.928 [120/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:42.928 [121/740] Generating lib/rte_eventdev_def with a custom command 00:01:42.928 [122/740] Generating lib/rte_eventdev_mingw with a custom command 00:01:42.928 [123/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:42.928 [124/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:42.928 [125/740] Generating lib/rte_gpudev_mingw with a custom command 00:01:42.928 [126/740] 
Generating lib/rte_gpudev_def with a custom command 00:01:42.928 [127/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:42.928 [128/740] Generating lib/rte_gro_mingw with a custom command 00:01:42.928 [129/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:42.928 [130/740] Generating lib/rte_gro_def with a custom command 00:01:42.928 [131/740] Generating lib/rte_gso_def with a custom command 00:01:42.928 [132/740] Generating lib/rte_gso_mingw with a custom command 00:01:43.191 [133/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:43.191 [134/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:43.191 [135/740] Generating lib/rte_ip_frag_def with a custom command 00:01:43.191 [136/740] Generating lib/rte_ip_frag_mingw with a custom command 00:01:43.191 [137/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.191 [138/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:43.191 [139/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.191 [140/740] Generating lib/rte_jobstats_def with a custom command 00:01:43.191 [141/740] Linking target lib/librte_kvargs.so.23.0 00:01:43.191 [142/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:43.191 [143/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:43.191 [144/740] Generating lib/rte_jobstats_mingw with a custom command 00:01:43.191 [145/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:43.191 [146/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.191 [147/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:43.191 [148/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:43.191 [149/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:43.191 [150/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:43.191 [151/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:43.191 [152/740] Linking static target lib/librte_cfgfile.a 00:01:43.191 [153/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:43.191 [154/740] Generating lib/rte_latencystats_def with a custom command 00:01:43.191 [155/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:43.191 [156/740] Generating lib/rte_latencystats_mingw with a custom command 00:01:43.191 [157/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:43.191 [158/740] Generating lib/rte_lpm_def with a custom command 00:01:43.451 [159/740] Generating lib/rte_lpm_mingw with a custom command 00:01:43.451 [160/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:43.451 [161/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:43.451 [162/740] Generating lib/rte_member_mingw with a custom command 00:01:43.451 [163/740] Generating lib/rte_member_def with a custom command 00:01:43.451 [164/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:43.451 [165/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:43.451 [166/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:43.451 [167/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 
00:01:43.451 [168/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:43.451 [169/740] Linking static target lib/librte_jobstats.a 00:01:43.451 [170/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:43.451 [171/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:43.451 [172/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.451 [173/740] Generating lib/rte_pcapng_mingw with a custom command 00:01:43.451 [174/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:43.451 [175/740] Generating lib/rte_pcapng_def with a custom command 00:01:43.451 [176/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:43.451 [177/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:43.451 [178/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:43.451 [179/740] Linking static target lib/librte_telemetry.a 00:01:43.451 [180/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:43.451 [181/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:43.451 [182/740] Linking static target lib/librte_cmdline.a 00:01:43.451 [183/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:43.451 [184/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:43.451 [185/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:43.451 [186/740] Generating lib/rte_power_mingw with a custom command 00:01:43.451 [187/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:43.451 [188/740] Generating lib/rte_power_def with a custom command 00:01:43.451 [189/740] Linking static target lib/librte_timer.a 00:01:43.451 [190/740] Generating lib/rte_rawdev_def with a custom command 00:01:43.451 [191/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:43.451 [192/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:43.451 [193/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:43.451 [194/740] Generating lib/rte_rawdev_mingw with a custom command 00:01:43.451 [195/740] Linking static target lib/librte_metrics.a 00:01:43.451 [196/740] Generating lib/rte_regexdev_def with a custom command 00:01:43.451 [197/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:43.451 [198/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:01:43.451 [199/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:43.451 [200/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:43.451 [201/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:43.451 [202/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:43.451 [203/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:43.451 [204/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:43.451 [205/740] Generating lib/rte_regexdev_mingw with a custom command 00:01:43.451 [206/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:43.451 [207/740] Generating lib/rte_dmadev_mingw with a custom command 00:01:43.451 [208/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:43.451 [209/740] Generating lib/rte_rib_mingw with a custom command 00:01:43.451 [210/740] Generating lib/rte_rib_def with a custom command 00:01:43.451 [211/740] Compiling C 
object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:43.451 [212/740] Linking static target lib/librte_net.a 00:01:43.451 [213/740] Generating lib/rte_dmadev_def with a custom command 00:01:43.451 [214/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:43.451 [215/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:43.451 [216/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:43.451 [217/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:43.451 [218/740] Linking static target lib/librte_bitratestats.a 00:01:43.451 [219/740] Generating lib/rte_reorder_mingw with a custom command 00:01:43.451 [220/740] Generating lib/rte_reorder_def with a custom command 00:01:43.451 [221/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:43.451 [222/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:43.451 [223/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:43.451 [224/740] Generating lib/rte_security_mingw with a custom command 00:01:43.451 [225/740] Generating lib/rte_stack_def with a custom command 00:01:43.451 [226/740] Generating lib/rte_sched_def with a custom command 00:01:43.451 [227/740] Generating lib/rte_stack_mingw with a custom command 00:01:43.451 [228/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:43.451 [229/740] Generating lib/rte_sched_mingw with a custom command 00:01:43.711 [230/740] Generating lib/rte_security_def with a custom command 00:01:43.711 [231/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:43.711 [232/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:43.711 [233/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:43.711 [234/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:43.711 [235/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:43.711 [236/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:43.711 [237/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:43.711 [238/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:43.711 [239/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:43.711 [240/740] Generating lib/rte_vhost_def with a custom command 00:01:43.711 [241/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:43.711 [242/740] Generating lib/rte_vhost_mingw with a custom command 00:01:43.711 [243/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:43.711 [244/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:43.711 [245/740] Generating lib/rte_ipsec_def with a custom command 00:01:43.711 [246/740] Generating lib/rte_ipsec_mingw with a custom command 00:01:43.711 [247/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:43.711 [248/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:43.711 [249/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:43.711 [250/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:43.712 [251/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:01:43.712 [252/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:43.712 [253/740] Generating lib/rte_fib_def with a custom command 00:01:43.712 [254/740] Generating 
lib/rte_fib_mingw with a custom command 00:01:43.712 [255/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:43.712 [256/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:43.712 [257/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:43.712 [258/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:43.712 [259/740] Linking static target lib/librte_stack.a 00:01:43.712 [260/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:43.712 [261/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:43.712 [262/740] Generating lib/rte_port_def with a custom command 00:01:43.712 [263/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:43.712 [264/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:43.712 [265/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:43.712 [266/740] Generating lib/rte_port_mingw with a custom command 00:01:43.712 [267/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:43.712 [268/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:43.712 [269/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:43.712 [270/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.712 [271/740] Generating lib/rte_pdump_mingw with a custom command 00:01:43.712 [272/740] Generating lib/rte_pdump_def with a custom command 00:01:43.712 [273/740] Linking static target lib/librte_compressdev.a 00:01:43.712 [274/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:43.712 [275/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:43.712 [276/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:43.973 [277/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:43.973 [278/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:43.973 [279/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.973 [280/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:43.973 [281/740] Linking static target lib/librte_rcu.a 00:01:43.973 [282/740] Linking static target lib/librte_mempool.a 00:01:43.973 [283/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:43.973 [284/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.973 [285/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:43.973 [286/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:43.973 [287/740] Linking static target lib/librte_bbdev.a 00:01:43.973 [288/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:43.973 [289/740] Linking static target lib/librte_rawdev.a 00:01:43.973 [290/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:43.973 [291/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.973 [292/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:43.973 [293/740] Generating lib/rte_table_def with a custom command 00:01:43.973 [294/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:43.973 [295/740] Generating lib/rte_table_mingw with a custom command 00:01:43.973 [296/740] Compiling C object 
lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:43.973 [297/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.973 [298/740] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:43.973 [299/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:43.973 [300/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:43.973 [301/740] Linking static target lib/librte_gpudev.a 00:01:43.973 [302/740] Linking static target lib/librte_gro.a 00:01:43.973 [303/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:43.973 [304/740] Linking static target lib/librte_dmadev.a 00:01:43.973 [305/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:43.973 [306/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:43.973 [307/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.973 [308/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.973 [309/740] Generating lib/rte_pipeline_def with a custom command 00:01:43.973 [310/740] Linking target lib/librte_telemetry.so.23.0 00:01:43.973 [311/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:43.973 [312/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:43.973 [313/740] Generating lib/rte_pipeline_mingw with a custom command 00:01:43.973 [314/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:43.973 [315/740] Linking static target lib/librte_gso.a 00:01:43.973 [316/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:43.973 [317/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.973 [318/740] Linking static target lib/librte_latencystats.a 00:01:44.236 [319/740] Generating lib/rte_graph_def with a custom command 00:01:44.236 [320/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:44.236 [321/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:01:44.236 [322/740] Generating lib/rte_graph_mingw with a custom command 00:01:44.236 [323/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:44.236 [324/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:44.236 [325/740] Linking static target lib/librte_distributor.a 00:01:44.236 [326/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:44.236 [327/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:44.236 [328/740] Linking static target lib/librte_ip_frag.a 00:01:44.236 [329/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:44.236 [330/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:44.236 [331/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:44.236 [332/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:44.236 [333/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:01:44.236 [334/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:44.236 [335/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:44.236 [336/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:44.236 [337/740] Compiling C object 
lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:44.236 [338/740] Linking static target lib/librte_regexdev.a 00:01:44.236 [339/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:44.236 [340/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:44.236 [341/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:44.236 [342/740] Generating lib/rte_node_mingw with a custom command 00:01:44.236 [343/740] Generating lib/rte_node_def with a custom command 00:01:44.236 [344/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:44.236 [345/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.236 [346/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.236 [347/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.500 [348/740] Generating drivers/rte_bus_pci_def with a custom command 00:01:44.500 [349/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:01:44.500 [350/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:44.500 [351/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:44.500 [352/740] Linking static target lib/librte_reorder.a 00:01:44.500 [353/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:44.500 [354/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:44.500 [355/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:44.500 [356/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:44.500 [357/740] Generating drivers/rte_mempool_ring_def with a custom command 00:01:44.500 [358/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:44.500 [359/740] Generating drivers/rte_bus_vdev_def with a custom command 00:01:44.500 [360/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:01:44.500 [361/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.500 [362/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:44.500 [363/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:01:44.500 [364/740] Linking static target lib/librte_eal.a 00:01:44.500 [365/740] Linking static target lib/librte_power.a 00:01:44.500 [366/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:44.500 [367/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:44.500 [368/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:44.500 [369/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:44.500 [370/740] Linking static target lib/librte_pcapng.a 00:01:44.500 [371/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:44.500 [372/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:44.500 [373/740] Linking static target lib/librte_security.a 00:01:44.500 [374/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:44.500 [375/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:44.500 [376/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:44.500 [377/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:44.500 [378/740] Compiling C object 
lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:44.500 [379/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.500 [380/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:44.500 [381/740] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:44.500 [382/740] Linking static target lib/librte_mbuf.a 00:01:44.500 [383/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:44.500 [384/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:44.500 [385/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.500 [386/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:44.761 [387/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:44.761 [388/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:44.761 [389/740] Generating drivers/rte_net_i40e_def with a custom command 00:01:44.761 [390/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:44.761 [391/740] Linking static target lib/librte_bpf.a 00:01:44.761 [392/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:44.761 [393/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.761 [394/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:01:44.761 [395/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:44.761 [396/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:44.761 [397/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:44.761 [398/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:44.761 [399/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:44.761 [400/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:44.761 [401/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:44.761 [402/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:44.761 [403/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:44.761 [404/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:44.761 [405/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:44.761 [406/740] Linking static target lib/librte_rib.a 00:01:44.761 [407/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:44.761 [408/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:44.761 [409/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.761 [410/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:44.761 [411/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:44.761 [412/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:44.761 [413/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.761 [414/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:44.761 [415/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:44.761 [416/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:44.761 [417/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:44.761 [418/740] Compiling C object 
lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:44.761 [419/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:44.761 [420/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:44.761 [421/740] Linking static target lib/librte_lpm.a 00:01:44.761 [422/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.761 [423/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:44.761 [424/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:44.761 [425/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:45.022 [426/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:45.022 [427/740] Linking static target lib/librte_graph.a 00:01:45.022 [428/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:45.022 [429/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:45.022 [430/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:45.022 [431/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:45.022 [432/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:45.022 [433/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:45.022 [434/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:45.022 [435/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:45.022 [436/740] Linking static target lib/librte_efd.a 00:01:45.022 [437/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.022 [438/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:45.022 [439/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.022 [440/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:45.022 [441/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:45.022 [442/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:45.022 [443/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:45.023 [444/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:45.023 [445/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:45.023 [446/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:45.023 [447/740] Linking static target drivers/librte_bus_vdev.a 00:01:45.023 [448/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:45.023 [449/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:45.023 [450/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:45.023 [451/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:45.023 [452/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:45.023 [453/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.023 [454/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:45.284 [455/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:45.284 [456/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:45.284 [457/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.284 [458/740] 
Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:45.284 [459/740] Linking static target lib/librte_fib.a 00:01:45.284 [460/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.284 [461/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:45.284 [462/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.284 [463/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.284 [464/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.545 [465/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:45.545 [466/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:45.545 [467/740] Linking static target lib/librte_pdump.a 00:01:45.545 [468/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:45.545 [469/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.545 [470/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:45.545 [471/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:45.545 [472/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.545 [473/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:45.546 [474/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.546 [475/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.546 [476/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.546 [477/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:45.546 [478/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:45.546 [479/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:45.546 [480/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:45.546 [481/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:45.546 [482/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:45.546 [483/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:45.546 [484/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:45.546 [485/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:45.546 [486/740] Linking static target drivers/librte_bus_pci.a 00:01:45.546 [487/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:45.546 [488/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:45.805 [489/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:45.805 [490/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:45.805 [491/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:45.805 [492/740] Linking static target lib/librte_table.a 00:01:45.805 [493/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:45.805 [494/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:45.805 [495/740] Compiling C 
object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:45.805 [496/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:45.805 [497/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:45.805 [498/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:45.805 [499/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:45.805 [500/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:45.805 [501/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.805 [502/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:45.805 [503/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:45.805 [504/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:45.805 [505/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:45.805 [506/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:45.805 [507/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:45.805 [508/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.805 [509/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:45.805 [510/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:45.805 [511/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:46.063 [512/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:46.063 [513/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:46.063 [514/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:46.063 [515/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:46.063 [516/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:46.063 [517/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:46.063 [518/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:46.063 [519/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.063 [520/740] Linking static target lib/librte_cryptodev.a 00:01:46.063 [521/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:46.063 [522/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:46.063 [523/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:46.063 [524/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:46.063 [525/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:46.063 [526/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:46.063 [527/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:46.063 [528/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:46.063 [529/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:46.063 [530/740] Linking static target lib/librte_sched.a 00:01:46.063 [531/740] Linking static target lib/librte_node.a 00:01:46.063 [532/740] Compiling C object 
app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:46.063 [533/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.063 [534/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:46.063 [535/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:46.063 [536/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:46.063 [537/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:46.063 [538/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:46.063 [539/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:46.063 [540/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:46.063 [541/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:46.063 [542/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:46.321 [543/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:46.321 [544/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:46.321 [545/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.321 [546/740] Linking static target lib/librte_ipsec.a 00:01:46.321 [547/740] Linking static target lib/librte_ethdev.a 00:01:46.321 [548/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:46.321 [549/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:46.321 [550/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:46.321 [551/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:46.321 [552/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:46.321 [553/740] Linking static target drivers/librte_mempool_ring.a 00:01:46.321 [554/740] Linking static target lib/librte_member.a 00:01:46.321 [555/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:46.321 [556/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:46.321 [557/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:46.321 [558/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:46.321 [559/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:46.321 [560/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:46.321 [561/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:46.321 [562/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:46.321 [563/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:46.321 [564/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.321 [565/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:46.321 [566/740] Linking static target lib/librte_port.a 00:01:46.321 [567/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:46.321 [568/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:46.321 [569/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:46.321 [570/740] Compiling C object 
app/dpdk-test-sad.p/test-sad_main.c.o 00:01:46.321 [571/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:46.321 [572/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:46.321 [573/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:46.321 [574/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:46.579 [575/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:46.579 [576/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:46.579 [577/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:46.579 [578/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:46.579 [579/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:46.579 [580/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:46.579 [581/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:46.579 [582/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:46.579 [583/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:46.579 [584/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:46.579 [585/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:46.579 [586/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:46.579 [587/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:46.579 [588/740] Linking static target lib/librte_hash.a 00:01:46.579 [589/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.579 [590/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.579 [591/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:46.579 [592/740] Linking static target lib/librte_eventdev.a 00:01:46.579 [593/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:46.836 [594/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.836 [595/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:46.836 [596/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.836 [597/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:46.836 [598/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:46.836 [599/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:46.836 [600/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:01:46.836 [601/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:01:46.836 [602/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:46.836 [603/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:46.836 [604/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:47.094 [605/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:47.094 [606/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:47.094 [607/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:47.094 [608/740] Linking static 
target lib/librte_acl.a 00:01:47.094 [609/740] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:47.094 [610/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:47.094 [611/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:47.094 [612/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:47.352 [613/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.352 [614/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.610 [615/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:47.610 [616/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:47.610 [617/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:47.868 [618/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:48.126 [619/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:48.126 [620/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:48.384 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:48.950 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:48.950 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:49.209 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:49.209 [625/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:49.209 [626/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:49.209 [627/740] Linking static target drivers/librte_net_i40e.a 00:01:49.776 [628/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:49.776 [629/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.048 [630/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.048 [631/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:01:50.048 [632/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:01:50.386 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.660 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.660 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:55.919 [636/740] Linking static target lib/librte_vhost.a 00:01:56.485 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:01:56.485 [638/740] Linking static target lib/librte_pipeline.a 00:01:57.052 [639/740] Linking target app/dpdk-test-acl 00:01:57.052 [640/740] Linking target app/dpdk-proc-info 00:01:57.052 [641/740] Linking target app/dpdk-test-cmdline 00:01:57.052 [642/740] Linking target app/dpdk-test-fib 00:01:57.052 [643/740] Linking target app/dpdk-test-gpudev 00:01:57.052 [644/740] Linking target app/dpdk-test-crypto-perf 00:01:57.052 [645/740] Linking target app/dpdk-pdump 00:01:57.052 [646/740] Linking target app/dpdk-test-bbdev 00:01:57.052 [647/740] Linking target app/dpdk-test-flow-perf 00:01:57.052 [648/740] Linking target app/dpdk-test-security-perf 00:01:57.052 [649/740] Linking target app/dpdk-test-eventdev 00:01:57.052 [650/740] Linking target app/dpdk-dumpcap 
00:01:57.052 [651/740] Linking target app/dpdk-test-sad 00:01:57.052 [652/740] Linking target app/dpdk-test-regex 00:01:57.052 [653/740] Linking target app/dpdk-test-compress-perf 00:01:57.052 [654/740] Linking target app/dpdk-test-pipeline 00:01:57.052 [655/740] Linking target app/dpdk-testpmd 00:01:57.989 [656/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.249 [657/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.507 [658/740] Linking target lib/librte_eal.so.23.0 00:01:58.507 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:01:58.507 [660/740] Linking target lib/librte_timer.so.23.0 00:01:58.507 [661/740] Linking target lib/librte_dmadev.so.23.0 00:01:58.507 [662/740] Linking target lib/librte_ring.so.23.0 00:01:58.507 [663/740] Linking target lib/librte_rawdev.so.23.0 00:01:58.507 [664/740] Linking target lib/librte_meter.so.23.0 00:01:58.507 [665/740] Linking target lib/librte_cfgfile.so.23.0 00:01:58.507 [666/740] Linking target lib/librte_pci.so.23.0 00:01:58.507 [667/740] Linking target lib/librte_jobstats.so.23.0 00:01:58.507 [668/740] Linking target lib/librte_stack.so.23.0 00:01:58.507 [669/740] Linking target lib/librte_graph.so.23.0 00:01:58.507 [670/740] Linking target drivers/librte_bus_vdev.so.23.0 00:01:58.507 [671/740] Linking target lib/librte_acl.so.23.0 00:01:58.766 [672/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:01:58.766 [673/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:01:58.766 [674/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:01:58.766 [675/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:01:58.766 [676/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:01:58.766 [677/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:01:58.766 [678/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:01:58.766 [679/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:01:58.766 [680/740] Linking target lib/librte_rcu.so.23.0 00:01:58.766 [681/740] Linking target lib/librte_mempool.so.23.0 00:01:58.766 [682/740] Linking target drivers/librte_bus_pci.so.23.0 00:01:58.766 [683/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:01:59.024 [684/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:01:59.024 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:01:59.024 [686/740] Linking target drivers/librte_mempool_ring.so.23.0 00:01:59.024 [687/740] Linking target lib/librte_mbuf.so.23.0 00:01:59.024 [688/740] Linking target lib/librte_rib.so.23.0 00:01:59.024 [689/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:01:59.024 [690/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:01:59.024 [691/740] Linking target lib/librte_compressdev.so.23.0 00:01:59.024 [692/740] Linking target lib/librte_distributor.so.23.0 00:01:59.024 [693/740] Linking target lib/librte_net.so.23.0 00:01:59.024 [694/740] Linking target lib/librte_bbdev.so.23.0 00:01:59.024 [695/740] Linking target lib/librte_gpudev.so.23.0 00:01:59.024 [696/740] Linking target lib/librte_reorder.so.23.0 
00:01:59.024 [697/740] Linking target lib/librte_regexdev.so.23.0 00:01:59.024 [698/740] Linking target lib/librte_cryptodev.so.23.0 00:01:59.282 [699/740] Linking target lib/librte_sched.so.23.0 00:01:59.282 [700/740] Linking target lib/librte_fib.so.23.0 00:01:59.282 [701/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols 00:01:59.282 [702/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols 00:01:59.282 [703/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols 00:01:59.282 [704/740] Linking target lib/librte_cmdline.so.23.0 00:01:59.282 [705/740] Linking target lib/librte_security.so.23.0 00:01:59.282 [706/740] Linking target lib/librte_hash.so.23.0 00:01:59.282 [707/740] Linking target lib/librte_ethdev.so.23.0 00:01:59.541 [708/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols 00:01:59.541 [709/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols 00:01:59.541 [710/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols 00:01:59.541 [711/740] Linking target lib/librte_lpm.so.23.0 00:01:59.541 [712/740] Linking target lib/librte_eventdev.so.23.0 00:01:59.541 [713/740] Linking target lib/librte_efd.so.23.0 00:01:59.541 [714/740] Linking target lib/librte_member.so.23.0 00:01:59.541 [715/740] Linking target lib/librte_gro.so.23.0 00:01:59.541 [716/740] Linking target lib/librte_gso.so.23.0 00:01:59.541 [717/740] Linking target lib/librte_ipsec.so.23.0 00:01:59.541 [718/740] Linking target lib/librte_pcapng.so.23.0 00:01:59.541 [719/740] Linking target lib/librte_metrics.so.23.0 00:01:59.541 [720/740] Linking target lib/librte_ip_frag.so.23.0 00:01:59.541 [721/740] Linking target lib/librte_power.so.23.0 00:01:59.541 [722/740] Linking target lib/librte_bpf.so.23.0 00:01:59.541 [723/740] Linking target lib/librte_vhost.so.23.0 00:01:59.541 [724/740] Linking target drivers/librte_net_i40e.so.23.0 00:01:59.541 [725/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols 00:01:59.799 [726/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols 00:01:59.799 [727/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols 00:01:59.799 [728/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols 00:01:59.799 [729/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols 00:01:59.799 [730/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols 00:01:59.799 [731/740] Linking target lib/librte_node.so.23.0 00:01:59.799 [732/740] Linking target lib/librte_bitratestats.so.23.0 00:01:59.799 [733/740] Linking target lib/librte_pdump.so.23.0 00:01:59.799 [734/740] Linking target lib/librte_latencystats.so.23.0 00:01:59.799 [735/740] Linking target lib/librte_port.so.23.0 00:01:59.799 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols 00:02:00.058 [737/740] Linking target lib/librte_table.so.23.0 00:02:00.058 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols 00:02:01.434 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.434 [740/740] Linking target lib/librte_pipeline.so.23.0 00:02:01.434 12:01:48 -- common/autobuild_common.sh@190 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 
install 00:02:01.434 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:01.434 [0/1] Installing files. 00:02:01.696 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:01.696 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.696 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.696 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.696 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.696 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.696 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.696 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.696 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.696 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.696 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.697 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 
00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:01.698 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:01.698 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:01.699 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.699 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.700 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:01.700 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.700 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:01.701 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:01.701 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.701 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 
00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:01.702 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:01.702 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:01.962 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:01.962 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:01.962 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:01.962 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.962 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_cmdline.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 
Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_vhost.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:01.963 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:01.963 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:01.963 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:01.963 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:01.963 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-cmdline to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:01.963 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.963 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.963 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.963 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.963 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.963 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.964 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.964 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
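(Aside, not part of the build log: the rte_ring_*.h headers installed above are DPDK's lockless FIFO ring API. A minimal single-producer/single-consumer sketch follows; it assumes EAL has already been initialized and is illustrative only, not code from this build.)

/* Hypothetical sketch: basic rte_ring usage. Assumes rte_eal_init()
 * has already run successfully. */
#include <stdio.h>
#include <rte_ring.h>
#include <rte_lcore.h>

static int ring_demo(void)
{
    /* Ring size must be a power of two; usable capacity is size - 1
     * unless RING_F_EXACT_SZ is given. */
    struct rte_ring *r = rte_ring_create("demo_ring", 1024,
            rte_socket_id(), RING_F_SP_ENQ | RING_F_SC_DEQ);
    if (r == NULL)
        return -1;

    static int value = 42;  /* object must outlive its time in the ring */
    void *obj = NULL;

    if (rte_ring_enqueue(r, &value) != 0)   /* returns 0 on success */
        return -1;
    if (rte_ring_dequeue(r, &obj) != 0)
        return -1;

    printf("dequeued %d\n", *(int *)obj);
    rte_ring_free(r);
    return 0;
}
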
00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:01.965 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.226 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.227 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:02.228 
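[editor's note] The entries above finish staging the DPDK tree: public headers land in dpdk/build/include and the Python usertools in dpdk/build/bin. For orientation, a hedged sketch of how those staged tools are typically invoked (the flags are the standard options these tools document, not commands taken from this run):

    # Illustrative only -- exercising the usertools staged into dpdk/build/bin above
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
    ./build/bin/dpdk-devbind.py --status     # list network devices and their current driver bindings
    ./build/bin/dpdk-hugepages.py --show     # report hugepage mounts and reservations
    ./build/bin/dpdk-telemetry.py            # attach to a running DPDK app's telemetry socket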
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:02.228 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:02.228 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:02:02.228 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:02.228 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:02:02.228 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:02.228 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:02:02.228 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:02.228 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:02:02.228 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:02.228 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:02:02.228 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:02.228 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:02:02.228 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:02.228 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:02:02.228 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:02.228 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:02:02.228 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:02.228 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:02:02.228 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:02.228 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:02:02.228 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:02.228 Installing symlink pointing to librte_pci.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:02:02.228 Installing symlink pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:02.228 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:02:02.228 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:02.228 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:02:02.228 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:02.228 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:02:02.228 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:02.228 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:02:02.228 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:02.228 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:02:02.228 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:02.228 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:02:02.228 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:02.228 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:02:02.229 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:02.229 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:02:02.229 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:02.229 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:02:02.229 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:02.229 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:02:02.229 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:02.229 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:02:02.229 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:02.229 Installing symlink pointing 
to librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:02:02.229 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:02.229 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:02:02.229 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:02.229 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:02:02.229 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:02.229 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:02:02.229 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:02.229 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:02:02.229 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:02.229 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:02:02.229 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:02.229 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:02:02.229 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:02.229 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:02:02.229 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:02.229 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:02:02.229 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:02.229 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:02:02.229 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:02.229 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:02:02.229 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:02.229 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:02:02.229 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:02.229 
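[editor's note] Each pair of "Installing symlink pointing to ..." entries above builds the conventional three-name chain for a versioned shared library: the real file librte_X.so.23.0 carries the code, librte_X.so.23 is the SONAME the dynamic loader resolves at run time, and librte_X.so is the development name the linker resolves at build time. A minimal sketch of what the installer is doing, using librte_power from the log as the example (ln -s TARGET LINKNAME):

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
    ln -s librte_power.so.23.0 librte_power.so.23   # run-time (SONAME) link
    ln -s librte_power.so.23   librte_power.so      # build-time (-lrte_power) link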
Installing symlink pointing to librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:02:02.229 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:02.229 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:02:02.229 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:02.229 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:02:02.229 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:02.229 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:02:02.229 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:02.229 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:02:02.229 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:02.229 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:02:02.229 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:02.229 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:02:02.229 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:02.229 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:02:02.229 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:02.229 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:02:02.229 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:02:02.229 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:02:02.229 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:02:02.229 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:02:02.229 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:02:02.229 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:02:02.229 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:02:02.229 './librte_mempool_ring.so.23' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:02:02.229 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:02:02.229 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:02:02.229 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:02:02.229 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:02:02.229 Installing symlink pointing to 
librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:02.229 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:02:02.229 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:02.229 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:02:02.229 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:02.229 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:02:02.229 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:02.229 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:02:02.229 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:02.229 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:02:02.229 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:02.229 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:02:02.229 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:02.229 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:02:02.229 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:02.229 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:02:02.229 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:02.229 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:02:02.229 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:02.229 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:02:02.229 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:02:02.229 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:02:02.229 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:02:02.229 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:02:02.229 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:02:02.229 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:02:02.229 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:02:02.229 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:02:02.229 12:01:49 -- common/autobuild_common.sh@192 -- $ uname -s 00:02:02.229 12:01:49 -- common/autobuild_common.sh@192 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:02.229 12:01:49 -- common/autobuild_common.sh@203 -- $ cat 00:02:02.229 12:01:49 -- common/autobuild_common.sh@208 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:02.229 00:02:02.229 real 0m24.875s 00:02:02.229 user 6m35.399s 00:02:02.229 sys 2m15.673s 00:02:02.229 12:01:49 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:02.229 12:01:49 -- common/autotest_common.sh@10 -- $ set +x 00:02:02.229 ************************************ 00:02:02.229 END TEST build_native_dpdk 00:02:02.229 ************************************ 00:02:02.229 12:01:49 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:02.230 12:01:49 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:02.230 12:01:49 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:02.230 12:01:49 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:02.230 12:01:49 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:02.230 12:01:49 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']' 00:02:02.230 12:01:49 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:02:02.230 12:01:49 -- common/autotest_common.sh@10 -- $ set +x 00:02:02.230 ************************************ 00:02:02.230 START TEST autobuild_llvm_precompile 00:02:02.230 ************************************ 00:02:02.230 12:01:49 -- common/autotest_common.sh@1104 -- $ _llvm_precompile 00:02:02.230 12:01:49 -- common/autobuild_common.sh@32 -- $ clang --version 00:02:02.230 12:01:49 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:02:02.230 Target: x86_64-redhat-linux-gnu 00:02:02.230 Thread model: posix 00:02:02.230 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:02.230 12:01:49 -- common/autobuild_common.sh@33 -- $ clang_num=17 00:02:02.230 12:01:49 -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:02:02.230 12:01:49 -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:02:02.230 12:01:49 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:02:02.230 12:01:49 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:02:02.230 12:01:49 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:02:02.230 12:01:49 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:02.230 12:01:49 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:02:02.230 12:01:49 -- 
common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:02:02.230 12:01:49 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:02.489 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:02.489 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:02.489 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:02.747 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:03.006 Using 'verbs' RDMA provider 00:02:18.466 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:30.683 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:31.251 Creating mk/config.mk...done. 00:02:31.251 Creating mk/cc.flags.mk...done. 00:02:31.251 Type 'make' to build. 00:02:31.251 00:02:31.251 real 0m28.870s 00:02:31.251 user 0m12.716s 00:02:31.251 sys 0m15.554s 00:02:31.251 12:02:17 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:31.251 12:02:17 -- common/autotest_common.sh@10 -- $ set +x 00:02:31.251 ************************************ 00:02:31.251 END TEST autobuild_llvm_precompile 00:02:31.251 ************************************ 00:02:31.251 12:02:17 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:31.251 12:02:17 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:31.251 12:02:17 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:31.251 12:02:17 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:31.251 12:02:17 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:31.251 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:31.510 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:31.510 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:31.510 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:32.078 Using 'verbs' RDMA provider 00:02:44.855 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:57.065 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:57.065 Creating mk/config.mk...done. 
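[editor's note] The fuzzer_libs assignment traced from autobuild_common.sh above (script lines 38-40) locates clang's libFuzzer runtime with bash extended globs. A commented restatement follows; the glob itself is verbatim from the trace, while the shopt, the version variable, and the echo are illustrative (extglob must be enabled for the pattern to parse):

    shopt -s extglob
    clang_num=17; clang_version=17.0.6
    # @(A|B) matches exactly one of A or B; ?(X) matches zero or one occurrence of X
    fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
    echo "${fuzzer_libs[0]}"
    # -> /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
    #    (the same path the [[ -e ... ]] existence check above confirms)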
00:02:57.065 Creating mk/cc.flags.mk...done. 00:02:57.065 Type 'make' to build. 00:02:57.065 12:02:42 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:02:57.065 12:02:42 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']' 00:02:57.065 12:02:42 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:02:57.065 12:02:42 -- common/autotest_common.sh@10 -- $ set +x 00:02:57.065 ************************************ 00:02:57.065 START TEST make 00:02:57.065 ************************************ 00:02:57.065 12:02:42 -- common/autotest_common.sh@1104 -- $ make -j112 00:02:57.065 make[1]: Nothing to be done for 'all'. 00:02:58.002 The Meson build system 00:02:58.002 Version: 1.5.0 00:02:58.002 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:02:58.002 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:58.002 Build type: native build 00:02:58.002 Project name: libvfio-user 00:02:58.002 Project version: 0.0.1 00:02:58.002 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:02:58.002 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:02:58.002 Host machine cpu family: x86_64 00:02:58.002 Host machine cpu: x86_64 00:02:58.002 Run-time dependency threads found: YES 00:02:58.002 Library dl found: YES 00:02:58.002 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:58.002 Run-time dependency json-c found: YES 0.17 00:02:58.002 Run-time dependency cmocka found: YES 1.1.7 00:02:58.002 Program pytest-3 found: NO 00:02:58.002 Program flake8 found: NO 00:02:58.002 Program misspell-fixer found: NO 00:02:58.002 Program restructuredtext-lint found: NO 00:02:58.002 Program valgrind found: YES (/usr/bin/valgrind) 00:02:58.002 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:58.002 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:58.002 Compiler for C supports arguments -Wwrite-strings: YES 00:02:58.002 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:02:58.002 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:02:58.002 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:02:58.002 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
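[editor's note] Meson's probe results above (json-c 0.17 and cmocka 1.1.7 found; pytest-3, flake8, misspell-fixer, and restructuredtext-lint absent) determine which libvfio-user test suites get wired up. Meson typically resolves such run-time dependencies through pkg-config and program lookups on PATH; a rough, hedged equivalent of the probes it just ran (versions are the ones meson reported):

    pkg-config --exists json-c && pkg-config --modversion json-c   # 0.17
    pkg-config --exists cmocka && pkg-config --modversion cmocka   # 1.1.7
    command -v valgrind pytest-3 flake8                            # only valgrind resolves on this node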
00:02:58.002 Build targets in project: 8 00:02:58.002 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:02:58.002 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:02:58.002 00:02:58.002 libvfio-user 0.0.1 00:02:58.002 00:02:58.002 User defined options 00:02:58.003 buildtype : debug 00:02:58.003 default_library: static 00:02:58.003 libdir : /usr/local/lib 00:02:58.003 00:02:58.003 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:58.261 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:58.520 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:02:58.520 [2/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:02:58.520 [3/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:02:58.520 [4/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:02:58.520 [5/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:02:58.520 [6/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:02:58.520 [7/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:02:58.520 [8/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:02:58.520 [9/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:02:58.520 [10/36] Compiling C object samples/null.p/null.c.o 00:02:58.520 [11/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:02:58.520 [12/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:02:58.520 [13/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:02:58.520 [14/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:02:58.520 [15/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:02:58.520 [16/36] Compiling C object test/unit_tests.p/mocks.c.o 00:02:58.520 [17/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:02:58.520 [18/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:02:58.520 [19/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:02:58.520 [20/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:02:58.520 [21/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:02:58.520 [22/36] Compiling C object samples/server.p/server.c.o 00:02:58.520 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:02:58.520 [24/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:02:58.520 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:02:58.520 [26/36] Compiling C object samples/client.p/client.c.o 00:02:58.520 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:02:58.520 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:02:58.520 [29/36] Linking static target lib/libvfio-user.a 00:02:58.520 [30/36] Linking target samples/client 00:02:58.520 [31/36] Linking target samples/server 00:02:58.520 [32/36] Linking target samples/null 00:02:58.520 [33/36] Linking target samples/gpio-pci-idio-16 00:02:58.520 [34/36] Linking target samples/shadow_ioeventfd_server 00:02:58.520 [35/36] Linking target test/unit_tests 00:02:58.520 [36/36] Linking target samples/lspci 00:02:58.520 INFO: autodetecting backend as ninja 00:02:58.520 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:58.520 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:59.090 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:59.090 ninja: no work to do. 00:03:02.497 CC lib/ut/ut.o 00:03:02.497 CC lib/ut_mock/mock.o 00:03:02.497 CC lib/log/log.o 00:03:02.497 CC lib/log/log_flags.o 00:03:02.497 CC lib/log/log_deprecated.o 00:03:02.497 LIB libspdk_ut_mock.a 00:03:02.497 LIB libspdk_ut.a 00:03:02.497 LIB libspdk_log.a 00:03:02.497 CC lib/util/base64.o 00:03:02.497 CC lib/util/bit_array.o 00:03:02.497 CC lib/util/cpuset.o 00:03:02.497 CC lib/util/crc16.o 00:03:02.497 CC lib/util/crc32.o 00:03:02.497 CC lib/util/crc64.o 00:03:02.497 CC lib/util/crc32c.o 00:03:02.497 CC lib/util/crc32_ieee.o 00:03:02.497 CC lib/util/dif.o 00:03:02.497 CC lib/util/iov.o 00:03:02.497 CC lib/util/fd.o 00:03:02.497 CC lib/util/file.o 00:03:02.497 CC lib/util/hexlify.o 00:03:02.497 CC lib/util/pipe.o 00:03:02.497 CC lib/util/math.o 00:03:02.497 CC lib/util/xor.o 00:03:02.497 CC lib/util/strerror_tls.o 00:03:02.497 CC lib/util/string.o 00:03:02.497 CC lib/util/uuid.o 00:03:02.497 CC lib/util/fd_group.o 00:03:02.497 CC lib/util/zipf.o 00:03:02.497 CC lib/dma/dma.o 00:03:02.497 CXX lib/trace_parser/trace.o 00:03:02.497 CC lib/ioat/ioat.o 00:03:02.497 CC lib/vfio_user/host/vfio_user_pci.o 00:03:02.497 CC lib/vfio_user/host/vfio_user.o 00:03:02.497 LIB libspdk_dma.a 00:03:02.756 LIB libspdk_ioat.a 00:03:02.756 LIB libspdk_util.a 00:03:02.756 LIB libspdk_vfio_user.a 00:03:03.014 LIB libspdk_trace_parser.a 00:03:03.014 CC lib/vmd/vmd.o 00:03:03.014 CC lib/vmd/led.o 00:03:03.014 CC lib/json/json_write.o 00:03:03.014 CC lib/json/json_parse.o 00:03:03.014 CC lib/json/json_util.o 00:03:03.014 CC lib/conf/conf.o 00:03:03.014 CC lib/rdma/common.o 00:03:03.014 CC lib/rdma/rdma_verbs.o 00:03:03.014 CC lib/idxd/idxd.o 00:03:03.014 CC lib/idxd/idxd_user.o 00:03:03.014 CC lib/idxd/idxd_kernel.o 00:03:03.014 CC lib/env_dpdk/memory.o 00:03:03.014 CC lib/env_dpdk/env.o 00:03:03.014 CC lib/env_dpdk/pci.o 00:03:03.014 CC lib/env_dpdk/pci_ioat.o 00:03:03.014 CC lib/env_dpdk/init.o 00:03:03.014 CC lib/env_dpdk/threads.o 00:03:03.014 CC lib/env_dpdk/pci_virtio.o 00:03:03.014 CC lib/env_dpdk/pci_event.o 00:03:03.014 CC lib/env_dpdk/pci_vmd.o 00:03:03.014 CC lib/env_dpdk/pci_idxd.o 00:03:03.014 CC lib/env_dpdk/sigbus_handler.o 00:03:03.014 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:03.014 CC lib/env_dpdk/pci_dpdk.o 00:03:03.014 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:03.273 LIB libspdk_conf.a 00:03:03.273 LIB libspdk_json.a 00:03:03.273 LIB libspdk_rdma.a 00:03:03.273 LIB libspdk_idxd.a 00:03:03.273 LIB libspdk_vmd.a 00:03:03.533 CC lib/jsonrpc/jsonrpc_server.o 00:03:03.533 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:03.533 CC lib/jsonrpc/jsonrpc_client.o 00:03:03.533 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:03.792 LIB libspdk_jsonrpc.a 00:03:04.051 LIB libspdk_env_dpdk.a 00:03:04.051 CC lib/rpc/rpc.o 00:03:04.051 LIB libspdk_rpc.a 00:03:04.311 CC lib/trace/trace.o 00:03:04.311 CC lib/trace/trace_rpc.o 00:03:04.311 CC lib/trace/trace_flags.o 00:03:04.311 CC lib/notify/notify.o 00:03:04.570 CC lib/notify/notify_rpc.o 00:03:04.570 CC lib/sock/sock.o 00:03:04.570 CC lib/sock/sock_rpc.o 00:03:04.570 LIB libspdk_notify.a 00:03:04.570 LIB libspdk_trace.a 00:03:04.570 LIB libspdk_sock.a 00:03:04.830 CC lib/thread/thread.o 00:03:04.830 CC lib/thread/iobuf.o 00:03:05.089 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:05.089 CC lib/nvme/nvme_ctrlr.o 00:03:05.089 CC lib/nvme/nvme_ns.o 
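[editor's note] The interleaved "CC .../foo.o" and "LIB libspdk_foo.a" entries above and below are SPDK's quiet make output: each CC line compiles one object with the clang-17 toolchain exported earlier, and each LIB line archives a component's objects into a static library. An illustrative expansion for libspdk_log.a, shape only -- SPDK's real make rules add their full include-path and warning-flag set, which is assumed away here:

    clang-17 -c lib/log/log.c            -o lib/log/log.o
    clang-17 -c lib/log/log_flags.c      -o lib/log/log_flags.o
    clang-17 -c lib/log/log_deprecated.c -o lib/log/log_deprecated.o
    ar crs libspdk_log.a lib/log/log*.o      # surfaces in the log as "LIB libspdk_log.a"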
00:03:05.089 CC lib/nvme/nvme_fabric.o 00:03:05.089 CC lib/nvme/nvme_ns_cmd.o 00:03:05.089 CC lib/nvme/nvme_pcie.o 00:03:05.089 CC lib/nvme/nvme.o 00:03:05.089 CC lib/nvme/nvme_pcie_common.o 00:03:05.089 CC lib/nvme/nvme_qpair.o 00:03:05.089 CC lib/nvme/nvme_transport.o 00:03:05.089 CC lib/nvme/nvme_quirks.o 00:03:05.089 CC lib/nvme/nvme_discovery.o 00:03:05.089 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:05.089 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:05.089 CC lib/nvme/nvme_io_msg.o 00:03:05.089 CC lib/nvme/nvme_tcp.o 00:03:05.089 CC lib/nvme/nvme_opal.o 00:03:05.089 CC lib/nvme/nvme_poll_group.o 00:03:05.089 CC lib/nvme/nvme_zns.o 00:03:05.089 CC lib/nvme/nvme_vfio_user.o 00:03:05.089 CC lib/nvme/nvme_cuse.o 00:03:05.089 CC lib/nvme/nvme_rdma.o 00:03:05.655 LIB libspdk_thread.a 00:03:05.913 CC lib/vfu_tgt/tgt_endpoint.o 00:03:05.913 CC lib/vfu_tgt/tgt_rpc.o 00:03:05.913 CC lib/init/json_config.o 00:03:05.913 CC lib/init/subsystem_rpc.o 00:03:05.913 CC lib/init/subsystem.o 00:03:05.913 CC lib/init/rpc.o 00:03:05.913 CC lib/blob/blobstore.o 00:03:05.913 CC lib/blob/request.o 00:03:05.913 CC lib/blob/zeroes.o 00:03:05.913 CC lib/blob/blob_bs_dev.o 00:03:05.913 CC lib/virtio/virtio_vhost_user.o 00:03:05.913 CC lib/virtio/virtio.o 00:03:05.913 CC lib/accel/accel_sw.o 00:03:05.913 CC lib/accel/accel.o 00:03:05.913 CC lib/accel/accel_rpc.o 00:03:05.913 CC lib/virtio/virtio_vfio_user.o 00:03:05.913 CC lib/virtio/virtio_pci.o 00:03:06.171 LIB libspdk_init.a 00:03:06.171 LIB libspdk_vfu_tgt.a 00:03:06.171 LIB libspdk_virtio.a 00:03:06.171 LIB libspdk_nvme.a 00:03:06.430 CC lib/event/app.o 00:03:06.430 CC lib/event/reactor.o 00:03:06.430 CC lib/event/log_rpc.o 00:03:06.430 CC lib/event/app_rpc.o 00:03:06.430 CC lib/event/scheduler_static.o 00:03:06.689 LIB libspdk_accel.a 00:03:06.689 LIB libspdk_event.a 00:03:06.947 CC lib/bdev/bdev.o 00:03:06.947 CC lib/bdev/part.o 00:03:06.947 CC lib/bdev/bdev_rpc.o 00:03:06.947 CC lib/bdev/bdev_zone.o 00:03:06.947 CC lib/bdev/scsi_nvme.o 00:03:07.514 LIB libspdk_blob.a 00:03:07.773 CC lib/blobfs/blobfs.o 00:03:07.773 CC lib/blobfs/tree.o 00:03:07.773 CC lib/lvol/lvol.o 00:03:08.340 LIB libspdk_lvol.a 00:03:08.340 LIB libspdk_blobfs.a 00:03:08.599 LIB libspdk_bdev.a 00:03:08.857 CC lib/nvmf/ctrlr.o 00:03:08.857 CC lib/nvmf/ctrlr_discovery.o 00:03:08.857 CC lib/nvmf/nvmf.o 00:03:08.857 CC lib/nvmf/ctrlr_bdev.o 00:03:08.857 CC lib/nvmf/subsystem.o 00:03:08.857 CC lib/nvmf/nvmf_rpc.o 00:03:08.857 CC lib/scsi/dev.o 00:03:08.857 CC lib/nvmf/rdma.o 00:03:08.857 CC lib/nvmf/transport.o 00:03:08.857 CC lib/nvmf/tcp.o 00:03:08.857 CC lib/nvmf/vfio_user.o 00:03:08.857 CC lib/scsi/lun.o 00:03:08.857 CC lib/scsi/scsi_bdev.o 00:03:08.857 CC lib/scsi/port.o 00:03:08.857 CC lib/scsi/scsi.o 00:03:08.857 CC lib/scsi/task.o 00:03:08.857 CC lib/scsi/scsi_pr.o 00:03:08.857 CC lib/scsi/scsi_rpc.o 00:03:08.857 CC lib/ublk/ublk.o 00:03:08.857 CC lib/ublk/ublk_rpc.o 00:03:08.857 CC lib/nbd/nbd.o 00:03:08.857 CC lib/nbd/nbd_rpc.o 00:03:08.857 CC lib/ftl/ftl_core.o 00:03:08.857 CC lib/ftl/ftl_debug.o 00:03:08.857 CC lib/ftl/ftl_init.o 00:03:08.857 CC lib/ftl/ftl_layout.o 00:03:08.857 CC lib/ftl/ftl_io.o 00:03:08.857 CC lib/ftl/ftl_sb.o 00:03:08.857 CC lib/ftl/ftl_l2p.o 00:03:08.857 CC lib/ftl/ftl_l2p_flat.o 00:03:08.857 CC lib/ftl/ftl_nv_cache.o 00:03:08.857 CC lib/ftl/ftl_band.o 00:03:08.857 CC lib/ftl/ftl_band_ops.o 00:03:08.857 CC lib/ftl/ftl_writer.o 00:03:08.857 CC lib/ftl/ftl_rq.o 00:03:08.857 CC lib/ftl/ftl_reloc.o 00:03:08.857 CC lib/ftl/ftl_l2p_cache.o 00:03:08.857 CC 
lib/ftl/ftl_p2l.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:08.857 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:08.857 CC lib/ftl/utils/ftl_md.o 00:03:08.857 CC lib/ftl/utils/ftl_conf.o 00:03:08.857 CC lib/ftl/utils/ftl_mempool.o 00:03:08.857 CC lib/ftl/utils/ftl_bitmap.o 00:03:08.857 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:08.857 CC lib/ftl/utils/ftl_property.o 00:03:08.857 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:08.857 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:08.857 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:08.857 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:08.857 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:08.857 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:08.857 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:08.857 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:08.857 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:08.857 CC lib/ftl/base/ftl_base_dev.o 00:03:08.857 CC lib/ftl/base/ftl_base_bdev.o 00:03:08.857 CC lib/ftl/ftl_trace.o 00:03:09.115 LIB libspdk_nbd.a 00:03:09.115 LIB libspdk_scsi.a 00:03:09.115 LIB libspdk_ublk.a 00:03:09.374 LIB libspdk_ftl.a 00:03:09.374 CC lib/vhost/vhost.o 00:03:09.374 CC lib/vhost/vhost_rpc.o 00:03:09.374 CC lib/vhost/rte_vhost_user.o 00:03:09.374 CC lib/vhost/vhost_scsi.o 00:03:09.374 CC lib/vhost/vhost_blk.o 00:03:09.374 CC lib/iscsi/conn.o 00:03:09.374 CC lib/iscsi/init_grp.o 00:03:09.374 CC lib/iscsi/iscsi.o 00:03:09.374 CC lib/iscsi/md5.o 00:03:09.374 CC lib/iscsi/iscsi_subsystem.o 00:03:09.374 CC lib/iscsi/param.o 00:03:09.374 CC lib/iscsi/portal_grp.o 00:03:09.374 CC lib/iscsi/tgt_node.o 00:03:09.374 CC lib/iscsi/iscsi_rpc.o 00:03:09.374 CC lib/iscsi/task.o 00:03:09.942 LIB libspdk_nvmf.a 00:03:09.942 LIB libspdk_vhost.a 00:03:10.202 LIB libspdk_iscsi.a 00:03:10.770 CC module/env_dpdk/env_dpdk_rpc.o 00:03:10.770 CC module/vfu_device/vfu_virtio.o 00:03:10.770 CC module/vfu_device/vfu_virtio_blk.o 00:03:10.770 CC module/vfu_device/vfu_virtio_scsi.o 00:03:10.770 CC module/vfu_device/vfu_virtio_rpc.o 00:03:10.770 LIB libspdk_env_dpdk_rpc.a 00:03:10.770 CC module/blob/bdev/blob_bdev.o 00:03:10.770 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:10.770 CC module/accel/iaa/accel_iaa.o 00:03:10.770 CC module/accel/iaa/accel_iaa_rpc.o 00:03:10.770 CC module/accel/dsa/accel_dsa.o 00:03:10.770 CC module/sock/posix/posix.o 00:03:10.770 CC module/accel/dsa/accel_dsa_rpc.o 00:03:10.770 CC module/scheduler/gscheduler/gscheduler.o 00:03:10.770 CC module/accel/error/accel_error.o 00:03:10.770 CC module/accel/error/accel_error_rpc.o 00:03:10.770 CC module/accel/ioat/accel_ioat.o 00:03:10.770 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:10.770 CC module/accel/ioat/accel_ioat_rpc.o 00:03:10.770 LIB libspdk_scheduler_dpdk_governor.a 00:03:11.029 LIB libspdk_scheduler_gscheduler.a 00:03:11.029 LIB libspdk_accel_iaa.a 00:03:11.029 LIB libspdk_blob_bdev.a 00:03:11.029 LIB libspdk_accel_error.a 00:03:11.029 LIB libspdk_scheduler_dynamic.a 00:03:11.029 LIB libspdk_accel_ioat.a 00:03:11.029 LIB libspdk_accel_dsa.a 00:03:11.029 LIB libspdk_vfu_device.a 00:03:11.288 LIB 
libspdk_sock_posix.a 00:03:11.288 CC module/bdev/error/vbdev_error.o 00:03:11.288 CC module/bdev/error/vbdev_error_rpc.o 00:03:11.288 CC module/bdev/gpt/gpt.o 00:03:11.288 CC module/bdev/gpt/vbdev_gpt.o 00:03:11.288 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:11.288 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:11.288 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:11.288 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:11.288 CC module/bdev/nvme/bdev_nvme.o 00:03:11.288 CC module/bdev/lvol/vbdev_lvol.o 00:03:11.288 CC module/bdev/iscsi/bdev_iscsi.o 00:03:11.288 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:11.288 CC module/bdev/nvme/nvme_rpc.o 00:03:11.288 CC module/bdev/nvme/vbdev_opal.o 00:03:11.288 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:11.288 CC module/bdev/nvme/bdev_mdns_client.o 00:03:11.288 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:11.288 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:11.288 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:11.288 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:11.288 CC module/bdev/ftl/bdev_ftl.o 00:03:11.288 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:11.288 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:11.288 CC module/bdev/delay/vbdev_delay.o 00:03:11.288 CC module/bdev/aio/bdev_aio_rpc.o 00:03:11.288 CC module/bdev/aio/bdev_aio.o 00:03:11.288 CC module/bdev/passthru/vbdev_passthru.o 00:03:11.288 CC module/bdev/split/vbdev_split_rpc.o 00:03:11.288 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:11.288 CC module/bdev/split/vbdev_split.o 00:03:11.288 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:11.288 CC module/blobfs/bdev/blobfs_bdev.o 00:03:11.288 CC module/bdev/malloc/bdev_malloc.o 00:03:11.288 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:11.288 CC module/bdev/raid/bdev_raid.o 00:03:11.288 CC module/bdev/raid/bdev_raid_rpc.o 00:03:11.288 CC module/bdev/raid/bdev_raid_sb.o 00:03:11.288 CC module/bdev/null/bdev_null.o 00:03:11.288 CC module/bdev/raid/concat.o 00:03:11.288 CC module/bdev/raid/raid0.o 00:03:11.288 CC module/bdev/raid/raid1.o 00:03:11.288 CC module/bdev/null/bdev_null_rpc.o 00:03:11.548 LIB libspdk_bdev_error.a 00:03:11.548 LIB libspdk_bdev_gpt.a 00:03:11.548 LIB libspdk_bdev_split.a 00:03:11.548 LIB libspdk_bdev_ftl.a 00:03:11.548 LIB libspdk_bdev_null.a 00:03:11.548 LIB libspdk_blobfs_bdev.a 00:03:11.548 LIB libspdk_bdev_passthru.a 00:03:11.548 LIB libspdk_bdev_zone_block.a 00:03:11.548 LIB libspdk_bdev_aio.a 00:03:11.548 LIB libspdk_bdev_iscsi.a 00:03:11.548 LIB libspdk_bdev_delay.a 00:03:11.548 LIB libspdk_bdev_lvol.a 00:03:11.807 LIB libspdk_bdev_virtio.a 00:03:11.807 LIB libspdk_bdev_malloc.a 00:03:11.807 LIB libspdk_bdev_raid.a 00:03:12.743 LIB libspdk_bdev_nvme.a 00:03:13.002 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:13.002 CC module/event/subsystems/sock/sock.o 00:03:13.002 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:13.002 CC module/event/subsystems/iobuf/iobuf.o 00:03:13.002 CC module/event/subsystems/vmd/vmd.o 00:03:13.002 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:13.002 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:13.002 CC module/event/subsystems/scheduler/scheduler.o 00:03:13.260 LIB libspdk_event_sock.a 00:03:13.260 LIB libspdk_event_vhost_blk.a 00:03:13.260 LIB libspdk_event_vfu_tgt.a 00:03:13.260 LIB libspdk_event_scheduler.a 00:03:13.260 LIB libspdk_event_vmd.a 00:03:13.260 LIB libspdk_event_iobuf.a 00:03:13.519 CC module/event/subsystems/accel/accel.o 00:03:13.519 LIB libspdk_event_accel.a 00:03:14.086 CC module/event/subsystems/bdev/bdev.o 00:03:14.086 LIB 
libspdk_event_bdev.a 00:03:14.344 CC module/event/subsystems/ublk/ublk.o 00:03:14.344 CC module/event/subsystems/nbd/nbd.o 00:03:14.344 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:14.344 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:14.344 CC module/event/subsystems/scsi/scsi.o 00:03:14.344 LIB libspdk_event_ublk.a 00:03:14.344 LIB libspdk_event_nbd.a 00:03:14.603 LIB libspdk_event_scsi.a 00:03:14.603 LIB libspdk_event_nvmf.a 00:03:14.861 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:14.861 CC module/event/subsystems/iscsi/iscsi.o 00:03:14.861 LIB libspdk_event_vhost_scsi.a 00:03:14.861 LIB libspdk_event_iscsi.a 00:03:15.120 CC test/rpc_client/rpc_client_test.o 00:03:15.120 CXX app/trace/trace.o 00:03:15.120 TEST_HEADER include/spdk/accel.h 00:03:15.120 CC app/trace_record/trace_record.o 00:03:15.120 CC app/spdk_nvme_perf/perf.o 00:03:15.120 TEST_HEADER include/spdk/accel_module.h 00:03:15.120 CC app/spdk_nvme_discover/discovery_aer.o 00:03:15.120 TEST_HEADER include/spdk/assert.h 00:03:15.120 TEST_HEADER include/spdk/barrier.h 00:03:15.120 TEST_HEADER include/spdk/base64.h 00:03:15.120 TEST_HEADER include/spdk/bdev.h 00:03:15.120 TEST_HEADER include/spdk/bdev_module.h 00:03:15.120 TEST_HEADER include/spdk/bdev_zone.h 00:03:15.120 TEST_HEADER include/spdk/bit_array.h 00:03:15.120 TEST_HEADER include/spdk/bit_pool.h 00:03:15.120 CC app/spdk_top/spdk_top.o 00:03:15.120 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:15.120 TEST_HEADER include/spdk/blob_bdev.h 00:03:15.120 TEST_HEADER include/spdk/blobfs.h 00:03:15.120 TEST_HEADER include/spdk/blob.h 00:03:15.120 TEST_HEADER include/spdk/conf.h 00:03:15.120 CC app/spdk_nvme_identify/identify.o 00:03:15.120 TEST_HEADER include/spdk/config.h 00:03:15.120 TEST_HEADER include/spdk/cpuset.h 00:03:15.120 TEST_HEADER include/spdk/crc16.h 00:03:15.120 TEST_HEADER include/spdk/crc64.h 00:03:15.120 TEST_HEADER include/spdk/dif.h 00:03:15.120 TEST_HEADER include/spdk/crc32.h 00:03:15.120 TEST_HEADER include/spdk/endian.h 00:03:15.120 TEST_HEADER include/spdk/dma.h 00:03:15.120 TEST_HEADER include/spdk/env_dpdk.h 00:03:15.120 TEST_HEADER include/spdk/env.h 00:03:15.120 TEST_HEADER include/spdk/event.h 00:03:15.120 CC app/spdk_lspci/spdk_lspci.o 00:03:15.120 TEST_HEADER include/spdk/fd_group.h 00:03:15.380 TEST_HEADER include/spdk/fd.h 00:03:15.380 TEST_HEADER include/spdk/file.h 00:03:15.380 TEST_HEADER include/spdk/ftl.h 00:03:15.380 TEST_HEADER include/spdk/gpt_spec.h 00:03:15.380 TEST_HEADER include/spdk/hexlify.h 00:03:15.380 TEST_HEADER include/spdk/histogram_data.h 00:03:15.380 TEST_HEADER include/spdk/idxd.h 00:03:15.380 TEST_HEADER include/spdk/idxd_spec.h 00:03:15.380 TEST_HEADER include/spdk/init.h 00:03:15.380 TEST_HEADER include/spdk/ioat.h 00:03:15.380 TEST_HEADER include/spdk/iscsi_spec.h 00:03:15.380 TEST_HEADER include/spdk/ioat_spec.h 00:03:15.380 TEST_HEADER include/spdk/json.h 00:03:15.380 TEST_HEADER include/spdk/jsonrpc.h 00:03:15.380 TEST_HEADER include/spdk/likely.h 00:03:15.380 TEST_HEADER include/spdk/log.h 00:03:15.380 TEST_HEADER include/spdk/lvol.h 00:03:15.380 TEST_HEADER include/spdk/memory.h 00:03:15.380 CC app/iscsi_tgt/iscsi_tgt.o 00:03:15.380 TEST_HEADER include/spdk/mmio.h 00:03:15.380 TEST_HEADER include/spdk/nbd.h 00:03:15.380 TEST_HEADER include/spdk/notify.h 00:03:15.380 TEST_HEADER include/spdk/nvme.h 00:03:15.380 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:15.380 TEST_HEADER include/spdk/nvme_intel.h 00:03:15.380 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:15.380 TEST_HEADER 
include/spdk/nvme_spec.h 00:03:15.380 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:15.380 TEST_HEADER include/spdk/nvme_zns.h 00:03:15.380 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:15.380 TEST_HEADER include/spdk/nvmf.h 00:03:15.380 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:15.380 TEST_HEADER include/spdk/nvmf_spec.h 00:03:15.380 TEST_HEADER include/spdk/opal.h 00:03:15.380 TEST_HEADER include/spdk/nvmf_transport.h 00:03:15.380 TEST_HEADER include/spdk/opal_spec.h 00:03:15.380 TEST_HEADER include/spdk/pci_ids.h 00:03:15.380 TEST_HEADER include/spdk/pipe.h 00:03:15.380 TEST_HEADER include/spdk/queue.h 00:03:15.380 TEST_HEADER include/spdk/reduce.h 00:03:15.380 TEST_HEADER include/spdk/rpc.h 00:03:15.380 TEST_HEADER include/spdk/scheduler.h 00:03:15.380 TEST_HEADER include/spdk/scsi.h 00:03:15.380 TEST_HEADER include/spdk/scsi_spec.h 00:03:15.380 TEST_HEADER include/spdk/sock.h 00:03:15.380 CC app/nvmf_tgt/nvmf_main.o 00:03:15.380 CC app/vhost/vhost.o 00:03:15.380 TEST_HEADER include/spdk/stdinc.h 00:03:15.380 TEST_HEADER include/spdk/thread.h 00:03:15.380 TEST_HEADER include/spdk/string.h 00:03:15.380 CC app/spdk_dd/spdk_dd.o 00:03:15.380 TEST_HEADER include/spdk/trace.h 00:03:15.380 TEST_HEADER include/spdk/tree.h 00:03:15.380 TEST_HEADER include/spdk/trace_parser.h 00:03:15.380 TEST_HEADER include/spdk/util.h 00:03:15.380 TEST_HEADER include/spdk/ublk.h 00:03:15.380 TEST_HEADER include/spdk/uuid.h 00:03:15.380 TEST_HEADER include/spdk/version.h 00:03:15.380 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:15.380 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:15.380 TEST_HEADER include/spdk/vmd.h 00:03:15.380 TEST_HEADER include/spdk/vhost.h 00:03:15.380 TEST_HEADER include/spdk/xor.h 00:03:15.380 TEST_HEADER include/spdk/zipf.h 00:03:15.380 CXX test/cpp_headers/accel.o 00:03:15.380 CXX test/cpp_headers/accel_module.o 00:03:15.380 CXX test/cpp_headers/barrier.o 00:03:15.380 CXX test/cpp_headers/assert.o 00:03:15.380 CXX test/cpp_headers/base64.o 00:03:15.380 CXX test/cpp_headers/bdev_module.o 00:03:15.380 CXX test/cpp_headers/bdev.o 00:03:15.380 CXX test/cpp_headers/bdev_zone.o 00:03:15.380 CXX test/cpp_headers/bit_array.o 00:03:15.380 CXX test/cpp_headers/blob_bdev.o 00:03:15.380 CXX test/cpp_headers/bit_pool.o 00:03:15.380 CXX test/cpp_headers/blobfs_bdev.o 00:03:15.380 CXX test/cpp_headers/blobfs.o 00:03:15.380 CXX test/cpp_headers/blob.o 00:03:15.380 CXX test/cpp_headers/conf.o 00:03:15.380 CXX test/cpp_headers/config.o 00:03:15.380 CXX test/cpp_headers/cpuset.o 00:03:15.380 CXX test/cpp_headers/crc16.o 00:03:15.380 CXX test/cpp_headers/crc32.o 00:03:15.380 CXX test/cpp_headers/crc64.o 00:03:15.380 CXX test/cpp_headers/dif.o 00:03:15.380 CXX test/cpp_headers/dma.o 00:03:15.380 CXX test/cpp_headers/endian.o 00:03:15.380 CXX test/cpp_headers/env_dpdk.o 00:03:15.380 CXX test/cpp_headers/env.o 00:03:15.380 CXX test/cpp_headers/event.o 00:03:15.380 CXX test/cpp_headers/fd_group.o 00:03:15.380 CXX test/cpp_headers/fd.o 00:03:15.380 CXX test/cpp_headers/file.o 00:03:15.380 CXX test/cpp_headers/ftl.o 00:03:15.380 CXX test/cpp_headers/gpt_spec.o 00:03:15.380 CXX test/cpp_headers/hexlify.o 00:03:15.380 CXX test/cpp_headers/histogram_data.o 00:03:15.380 CXX test/cpp_headers/idxd.o 00:03:15.380 CXX test/cpp_headers/idxd_spec.o 00:03:15.380 CXX test/cpp_headers/init.o 00:03:15.380 CC app/spdk_tgt/spdk_tgt.o 00:03:15.380 CC app/fio/nvme/fio_plugin.o 00:03:15.380 CC examples/util/zipf/zipf.o 00:03:15.380 CC examples/ioat/perf/perf.o 00:03:15.380 CC examples/vmd/lsvmd/lsvmd.o 00:03:15.380 CC 
examples/vmd/led/led.o 00:03:15.380 CC examples/idxd/perf/perf.o 00:03:15.380 CC examples/nvme/hotplug/hotplug.o 00:03:15.380 CC examples/nvme/abort/abort.o 00:03:15.380 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:15.380 CC test/nvme/err_injection/err_injection.o 00:03:15.380 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:15.380 CC examples/nvme/reconnect/reconnect.o 00:03:15.380 CC examples/nvme/arbitration/arbitration.o 00:03:15.380 CC test/nvme/aer/aer.o 00:03:15.380 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:15.380 CC test/nvme/reset/reset.o 00:03:15.380 CC examples/ioat/verify/verify.o 00:03:15.380 CC test/nvme/e2edp/nvme_dp.o 00:03:15.380 CC test/nvme/simple_copy/simple_copy.o 00:03:15.380 CC test/env/pci/pci_ut.o 00:03:15.380 CC test/nvme/boot_partition/boot_partition.o 00:03:15.380 CC examples/accel/perf/accel_perf.o 00:03:15.380 CC test/nvme/sgl/sgl.o 00:03:15.380 CC examples/nvme/hello_world/hello_world.o 00:03:15.380 CC test/nvme/overhead/overhead.o 00:03:15.380 CXX test/cpp_headers/ioat.o 00:03:15.381 CC test/event/reactor_perf/reactor_perf.o 00:03:15.381 CC test/nvme/reserve/reserve.o 00:03:15.381 CC test/nvme/startup/startup.o 00:03:15.381 CC test/env/memory/memory_ut.o 00:03:15.381 CC test/nvme/fused_ordering/fused_ordering.o 00:03:15.381 CC test/event/reactor/reactor.o 00:03:15.381 CC test/nvme/connect_stress/connect_stress.o 00:03:15.381 CC examples/sock/hello_world/hello_sock.o 00:03:15.381 CC test/app/stub/stub.o 00:03:15.381 CC test/event/event_perf/event_perf.o 00:03:15.381 CC test/env/vtophys/vtophys.o 00:03:15.381 CC test/blobfs/mkfs/mkfs.o 00:03:15.381 CC test/nvme/compliance/nvme_compliance.o 00:03:15.381 CC test/thread/poller_perf/poller_perf.o 00:03:15.381 CC test/app/histogram_perf/histogram_perf.o 00:03:15.381 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:15.381 CC test/thread/lock/spdk_lock.o 00:03:15.381 CC test/nvme/fdp/fdp.o 00:03:15.381 CC test/app/jsoncat/jsoncat.o 00:03:15.381 CC test/nvme/cuse/cuse.o 00:03:15.381 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:15.381 CC test/accel/dif/dif.o 00:03:15.381 CC examples/blob/cli/blobcli.o 00:03:15.381 CC test/dma/test_dma/test_dma.o 00:03:15.381 CC app/fio/bdev/fio_plugin.o 00:03:15.381 CC examples/bdev/hello_world/hello_bdev.o 00:03:15.381 CC test/event/app_repeat/app_repeat.o 00:03:15.381 CC examples/blob/hello_world/hello_blob.o 00:03:15.381 CC examples/bdev/bdevperf/bdevperf.o 00:03:15.381 LINK spdk_lspci 00:03:15.381 CC examples/thread/thread/thread_ex.o 00:03:15.381 CC test/event/scheduler/scheduler.o 00:03:15.381 CC test/bdev/bdevio/bdevio.o 00:03:15.381 LINK rpc_client_test 00:03:15.381 CC test/app/bdev_svc/bdev_svc.o 00:03:15.381 CC examples/nvmf/nvmf/nvmf.o 00:03:15.381 CC test/lvol/esnap/esnap.o 00:03:15.381 CC test/env/mem_callbacks/mem_callbacks.o 00:03:15.648 LINK spdk_nvme_discover 00:03:15.648 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:15.648 LINK interrupt_tgt 00:03:15.648 LINK spdk_trace_record 00:03:15.648 LINK lsvmd 00:03:15.648 LINK led 00:03:15.648 CXX test/cpp_headers/ioat_spec.o 00:03:15.648 CXX test/cpp_headers/iscsi_spec.o 00:03:15.648 CXX test/cpp_headers/json.o 00:03:15.648 CXX test/cpp_headers/jsonrpc.o 00:03:15.648 CXX test/cpp_headers/likely.o 00:03:15.648 LINK iscsi_tgt 00:03:15.648 CXX test/cpp_headers/log.o 00:03:15.648 CXX test/cpp_headers/lvol.o 00:03:15.648 LINK vhost 00:03:15.648 LINK zipf 00:03:15.648 LINK reactor_perf 00:03:15.648 CXX test/cpp_headers/memory.o 00:03:15.648 CXX test/cpp_headers/mmio.o 00:03:15.648 LINK nvmf_tgt 00:03:15.648 
CXX test/cpp_headers/nbd.o 00:03:15.648 CXX test/cpp_headers/notify.o 00:03:15.648 CXX test/cpp_headers/nvme.o 00:03:15.648 CXX test/cpp_headers/nvme_intel.o 00:03:15.648 CXX test/cpp_headers/nvme_ocssd.o 00:03:15.648 LINK reactor 00:03:15.648 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:15.648 CXX test/cpp_headers/nvme_spec.o 00:03:15.648 CXX test/cpp_headers/nvme_zns.o 00:03:15.648 CXX test/cpp_headers/nvmf_cmd.o 00:03:15.648 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:15.648 CXX test/cpp_headers/nvmf.o 00:03:15.648 CXX test/cpp_headers/nvmf_spec.o 00:03:15.648 CXX test/cpp_headers/nvmf_transport.o 00:03:15.648 LINK vtophys 00:03:15.648 CXX test/cpp_headers/opal.o 00:03:15.648 LINK event_perf 00:03:15.648 CXX test/cpp_headers/opal_spec.o 00:03:15.648 LINK jsoncat 00:03:15.648 LINK poller_perf 00:03:15.648 CXX test/cpp_headers/pci_ids.o 00:03:15.648 LINK histogram_perf 00:03:15.648 CXX test/cpp_headers/pipe.o 00:03:15.648 LINK env_dpdk_post_init 00:03:15.648 CXX test/cpp_headers/queue.o 00:03:15.648 CXX test/cpp_headers/reduce.o 00:03:15.648 CXX test/cpp_headers/rpc.o 00:03:15.648 LINK boot_partition 00:03:15.648 CXX test/cpp_headers/scheduler.o 00:03:15.648 LINK pmr_persistence 00:03:15.648 CXX test/cpp_headers/scsi.o 00:03:15.648 LINK app_repeat 00:03:15.648 LINK startup 00:03:15.648 CXX test/cpp_headers/scsi_spec.o 00:03:15.648 LINK err_injection 00:03:15.648 LINK connect_stress 00:03:15.648 LINK reserve 00:03:15.648 LINK cmb_copy 00:03:15.649 CXX test/cpp_headers/sock.o 00:03:15.649 LINK spdk_tgt 00:03:15.649 LINK stub 00:03:15.649 LINK doorbell_aers 00:03:15.649 LINK fused_ordering 00:03:15.649 LINK verify 00:03:15.649 LINK hello_world 00:03:15.649 CXX test/cpp_headers/stdinc.o 00:03:15.649 LINK mkfs 00:03:15.649 LINK ioat_perf 00:03:15.649 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:15.649 LINK hotplug 00:03:15.649 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:15.649 LINK reset 00:03:15.649 LINK simple_copy 00:03:15.649 LINK hello_sock 00:03:15.649 LINK sgl 00:03:15.649 LINK bdev_svc 00:03:15.649 LINK hello_bdev 00:03:15.649 LINK overhead 00:03:15.649 LINK aer 00:03:15.649 LINK fdp 00:03:15.649 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:15.649 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:15.649 LINK hello_blob 00:03:15.649 LINK scheduler 00:03:15.649 LINK spdk_trace 00:03:15.649 LINK nvme_dp 00:03:15.909 LINK mem_callbacks 00:03:15.909 CXX test/cpp_headers/string.o 00:03:15.909 LINK thread 00:03:15.909 LINK idxd_perf 00:03:15.909 CXX test/cpp_headers/thread.o 00:03:15.909 CXX test/cpp_headers/trace.o 00:03:15.909 CXX test/cpp_headers/trace_parser.o 00:03:15.909 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:15.909 CXX test/cpp_headers/tree.o 00:03:15.909 CXX test/cpp_headers/ublk.o 00:03:15.909 CXX test/cpp_headers/util.o 00:03:15.909 CXX test/cpp_headers/uuid.o 00:03:15.909 CXX test/cpp_headers/version.o 00:03:15.909 CXX test/cpp_headers/vfio_user_pci.o 00:03:15.909 CXX test/cpp_headers/vfio_user_spec.o 00:03:15.909 CXX test/cpp_headers/vhost.o 00:03:15.909 LINK arbitration 00:03:15.909 CXX test/cpp_headers/vmd.o 00:03:15.909 CXX test/cpp_headers/xor.o 00:03:15.909 LINK reconnect 00:03:15.909 CXX test/cpp_headers/zipf.o 00:03:15.909 LINK nvmf 00:03:15.909 LINK test_dma 00:03:15.909 LINK dif 00:03:15.909 LINK abort 00:03:15.909 LINK accel_perf 00:03:15.910 LINK spdk_dd 00:03:15.910 LINK blobcli 00:03:15.910 LINK bdevio 00:03:15.910 LINK pci_ut 00:03:15.910 LINK nvme_manage 00:03:16.168 LINK nvme_fuzz 00:03:16.168 LINK nvme_compliance 00:03:16.168 LINK 
spdk_nvme 00:03:16.168 LINK spdk_bdev 00:03:16.168 LINK memory_ut 00:03:16.168 LINK spdk_nvme_identify 00:03:16.168 LINK llvm_vfio_fuzz 00:03:16.168 LINK spdk_nvme_perf 00:03:16.426 LINK cuse 00:03:16.426 LINK vhost_fuzz 00:03:16.426 LINK spdk_top 00:03:16.426 LINK llvm_nvme_fuzz 00:03:16.426 LINK bdevperf 00:03:16.684 LINK spdk_lock 00:03:16.942 LINK iscsi_fuzz 00:03:19.472 LINK esnap 00:03:19.472 00:03:19.472 real 0m23.360s 00:03:19.472 user 4m15.537s 00:03:19.472 sys 2m7.384s 00:03:19.472 12:03:06 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:03:19.472 12:03:06 -- common/autotest_common.sh@10 -- $ set +x 00:03:19.472 ************************************ 00:03:19.472 END TEST make 00:03:19.472 ************************************ 00:03:19.472 12:03:06 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:19.472 12:03:06 -- nvmf/common.sh@7 -- # uname -s 00:03:19.472 12:03:06 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:19.472 12:03:06 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:19.472 12:03:06 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:19.472 12:03:06 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:19.472 12:03:06 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:19.472 12:03:06 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:19.472 12:03:06 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:19.472 12:03:06 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:19.473 12:03:06 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:19.473 12:03:06 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:19.473 12:03:06 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:19.473 12:03:06 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:19.473 12:03:06 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:19.473 12:03:06 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:19.473 12:03:06 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:19.473 12:03:06 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:19.473 12:03:06 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:19.473 12:03:06 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:19.473 12:03:06 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:19.473 12:03:06 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:19.473 12:03:06 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:19.473 12:03:06 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:19.473 12:03:06 -- paths/export.sh@5 -- # export PATH 00:03:19.473 12:03:06 -- 
paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:19.473 12:03:06 -- nvmf/common.sh@46 -- # : 0 00:03:19.473 12:03:06 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:19.473 12:03:06 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:19.473 12:03:06 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:19.473 12:03:06 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:19.473 12:03:06 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:19.473 12:03:06 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:19.473 12:03:06 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:19.473 12:03:06 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:19.473 12:03:06 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:19.473 12:03:06 -- spdk/autotest.sh@32 -- # uname -s 00:03:19.473 12:03:06 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:19.473 12:03:06 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:19.473 12:03:06 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:19.473 12:03:06 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:19.473 12:03:06 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:19.473 12:03:06 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:19.473 12:03:06 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:19.473 12:03:06 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:19.473 12:03:06 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:19.473 12:03:06 -- spdk/autotest.sh@48 -- # udevadm_pid=1074622 00:03:19.473 12:03:06 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:19.473 12:03:06 -- spdk/autotest.sh@54 -- # echo 1074624 00:03:19.473 12:03:06 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:19.473 12:03:06 -- spdk/autotest.sh@56 -- # echo 1074625 00:03:19.473 12:03:06 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:19.473 12:03:06 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:03:19.473 12:03:06 -- spdk/autotest.sh@60 -- # echo 1074626 00:03:19.473 12:03:06 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:19.473 12:03:06 -- spdk/autotest.sh@62 -- # echo 1074627 00:03:19.473 12:03:06 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:19.473 12:03:06 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:19.473 12:03:06 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:19.473 12:03:06 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:19.473 12:03:06 -- common/autotest_common.sh@10 -- # set +x 00:03:19.473 12:03:06 -- spdk/autotest.sh@70 -- # create_test_list 00:03:19.473 12:03:06 -- common/autotest_common.sh@736 -- # xtrace_disable 00:03:19.473 12:03:06 -- common/autotest_common.sh@10 -- # set +x 00:03:19.473 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:19.731 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:19.731 12:03:06 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:19.731 12:03:06 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:19.731 12:03:06 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:19.731 12:03:06 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:19.731 12:03:06 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:19.731 12:03:06 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:19.731 12:03:06 -- common/autotest_common.sh@1440 -- # uname 00:03:19.731 12:03:06 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:03:19.731 12:03:06 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:03:19.731 12:03:06 -- common/autotest_common.sh@1460 -- # uname 00:03:19.731 12:03:06 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:03:19.731 12:03:06 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:03:19.731 12:03:06 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:03:19.731 12:03:06 -- spdk/autotest.sh@83 -- # hash lcov 00:03:19.731 12:03:06 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:03:19.731 12:03:06 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:03:19.731 12:03:06 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:19.731 12:03:06 -- common/autotest_common.sh@10 -- # set +x 00:03:19.731 12:03:06 -- spdk/autotest.sh@102 -- # rm -f 00:03:19.731 12:03:06 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:23.013 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:23.013 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:23.013 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:23.013 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:23.013 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:23.013 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:23.013 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:23.013 
0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:23.013 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:23.013 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:23.013 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:23.272 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:23.272 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:23.272 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:23.272 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:23.272 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:23.272 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:23.272 12:03:10 -- spdk/autotest.sh@107 -- # get_zoned_devs 12:03:10 -- common/autotest_common.sh@1654 -- # zoned_devs=() 12:03:10 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 12:03:10 -- common/autotest_common.sh@1655 -- # local nvme bdf 12:03:10 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 12:03:10 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 12:03:10 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 12:03:10 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 12:03:10 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 12:03:10 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 12:03:10 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 12:03:10 -- spdk/autotest.sh@121 -- # grep -v p 12:03:10 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 12:03:10 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 12:03:10 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 12:03:10 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 12:03:10 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 No valid GPT data, bailing 12:03:10 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 12:03:10 -- scripts/common.sh@393 -- # pt= 12:03:10 -- scripts/common.sh@394 -- # return 1 12:03:10 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 1+0 records in 00:03:23.272 1+0 records out 00:03:23.272 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00552983 s, 190 MB/s 00:03:23.272 12:03:10 -- spdk/autotest.sh@129 -- # sync 12:03:10 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 12:03:10 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 12:03:10 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:31.396 12:03:17 -- spdk/autotest.sh@135 -- # uname -s 12:03:17 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 12:03:17 -- spdk/autotest.sh@136 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 12:03:17 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 12:03:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 12:03:17 -- common/autotest_common.sh@10 -- # set +x 00:03:31.396 ************************************ 00:03:31.396 START TEST setup.sh 00:03:31.396 ************************************ 00:03:31.396
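The trace just above records autotest's standard pre-test device scrub: get_zoned_devs walks /sys/block/nvme*/queue/zoned so that no zoned namespace gets wiped, spdk-gpt.py and blkid then probe /dev/nvme0n1 for a partition table, and since none is found ('No valid GPT data, bailing', pt=) the first MiB of the disk is zeroed with dd and flushed with sync. A minimal sketch of that pattern, assuming a single idle, non-zoned /dev/nvme0n1 (the real logic in spdk/autotest.sh and scripts/common.sh also handles multiple namespaces and in-use devices):

# Sketch only: wipe the first MiB of every idle, non-zoned NVMe namespace
# that carries no recognizable partition table.
for dev in /dev/nvme*n*; do
    [[ $dev == *p* ]] && continue                    # skip partitions (nvme0n1p1, ...)
    name=${dev#/dev/}
    zoned=$(cat "/sys/block/$name/queue/zoned" 2>/dev/null || echo none)
    [[ $zoned != none ]] && continue                 # never zero a zoned device
    if [[ -z $(blkid -s PTTYPE -o value "$dev") ]]; then
        dd if=/dev/zero of="$dev" bs=1M count=1      # no partition table: scrub the header
        sync
    fi
done

12:03:17 -- 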
common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:31.396 * Looking for test storage... 00:03:31.396 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:31.396 12:03:17 -- setup/test-setup.sh@10 -- # uname -s 00:03:31.396 12:03:17 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:31.396 12:03:17 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:31.396 12:03:17 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:31.396 12:03:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:31.396 12:03:17 -- common/autotest_common.sh@10 -- # set +x 00:03:31.396 ************************************ 00:03:31.396 START TEST acl 00:03:31.396 ************************************ 00:03:31.396 12:03:17 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:31.396 * Looking for test storage... 00:03:31.396 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:31.396 12:03:17 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:31.396 12:03:17 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:31.396 12:03:17 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:31.396 12:03:17 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:31.396 12:03:17 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:31.396 12:03:17 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:31.396 12:03:17 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:31.396 12:03:17 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:31.396 12:03:17 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:31.396 12:03:17 -- setup/acl.sh@12 -- # devs=() 00:03:31.396 12:03:17 -- setup/acl.sh@12 -- # declare -a devs 00:03:31.396 12:03:17 -- setup/acl.sh@13 -- # drivers=() 00:03:31.396 12:03:17 -- setup/acl.sh@13 -- # declare -A drivers 00:03:31.396 12:03:17 -- setup/acl.sh@51 -- # setup reset 00:03:31.396 12:03:17 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:31.396 12:03:17 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:34.685 12:03:21 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:34.685 12:03:21 -- setup/acl.sh@16 -- # local dev driver 00:03:34.685 12:03:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.685 12:03:21 -- setup/acl.sh@15 -- # setup output status 00:03:34.685 12:03:21 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:34.685 12:03:21 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:37.972 Hugepages 00:03:37.972 node hugesize free / total 00:03:37.972 12:03:24 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:37.972 12:03:24 -- setup/acl.sh@19 -- # continue 00:03:37.972 12:03:24 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.972 12:03:24 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:37.972 12:03:24 -- setup/acl.sh@19 -- # continue 00:03:37.972 12:03:24 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.972 12:03:24 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:37.972 12:03:24 -- setup/acl.sh@19 -- # continue 00:03:37.972 12:03:24 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.972 00:03:37.972 Type BDF Vendor Device 
NUMA Driver Device Block devices 00:03:37.972 12:03:24 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 12:03:24 -- setup/acl.sh@19 -- # continue 12:03:24 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.972 12:03:24 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 12:03:24 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 12:03:24 -- setup/acl.sh@20 -- # continue 12:03:24 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ [... the identical '[[ <bdf> == *:*:*.* ]] / [[ ioatdma == nvme ]] / continue / read' trace repeats for 0000:00:04.1 through 0000:00:04.7 and 0000:80:04.0 through 0000:80:04.7 ...] 00:03:37.973 12:03:24 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 12:03:24 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 12:03:24 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 12:03:24 -- setup/acl.sh@22 -- # devs+=("$dev") 12:03:24 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 12:03:24 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:37.973 12:03:24 -- setup/acl.sh@24 -- # (( 1 > 0 ))
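At this point the acl test has collected one usable controller (0000:d8:00.0) and moves on to the denied/allowed pair, which drive scripts/setup.sh through the PCI_BLOCKED and PCI_ALLOWED environment variables: a blocked BDF is skipped ('Skipping denied controller' below), while an allowed one is rebound from the kernel nvme driver to vfio-pci. A minimal sketch of that filtering rule, with an illustrative helper name (the real check in SPDK's scripts differs in detail):

# Sketch only: decide whether setup.sh may touch a given PCI BDF.
pci_usable() {
    local bdf=$1 i
    for i in $PCI_BLOCKED; do                 # an explicit block always wins
        [[ $i == "$bdf" ]] && return 1
    done
    [[ -z $PCI_ALLOWED ]] && return 0         # empty allow-list permits everything
    for i in $PCI_ALLOWED; do
        [[ $i == "$bdf" ]] && return 0
    done
    return 1                                  # non-empty allow-list and not on it
}

Under PCI_BLOCKED=' 0000:d8:00.0' the NVMe controller fails this check and 'setup config' skips it; under PCI_ALLOWED=0000:d8:00.0 it passes and gets rebound to vfio-pci, which is exactly what the two test runs below record.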
12:03:24 -- setup/acl.sh@54 -- # run_test denied denied 12:03:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 12:03:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 12:03:24 -- common/autotest_common.sh@10 -- # set +x 00:03:37.973 ************************************ 00:03:37.973 START TEST denied 00:03:37.973 ************************************ 00:03:37.973 12:03:24 -- common/autotest_common.sh@1104 -- # denied 12:03:24 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 12:03:24 -- setup/acl.sh@38 -- # setup output config 12:03:24 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 12:03:24 -- setup/common.sh@9 -- # [[ output == output ]] 12:03:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:41.329 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 12:03:28 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 12:03:28 -- setup/acl.sh@28 -- # local dev driver 12:03:28 -- setup/acl.sh@30 -- # for dev in "$@" 12:03:28 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 12:03:28 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 12:03:28 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 12:03:28 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 12:03:28 -- setup/acl.sh@41 -- # setup reset 12:03:28 -- setup/common.sh@9 -- # [[ reset == output ]] 12:03:28 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:46.619 00:03:46.619 real 0m8.107s 00:03:46.619 user 0m2.478s 00:03:46.619 sys 0m4.947s 12:03:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 12:03:32 -- 
common/autotest_common.sh@10 -- # set +x 00:03:46.619 ************************************ 00:03:46.619 END TEST denied 00:03:46.619 ************************************ 00:03:46.619 12:03:32 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:46.619 12:03:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:46.619 12:03:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:46.619 12:03:32 -- common/autotest_common.sh@10 -- # set +x 00:03:46.619 ************************************ 00:03:46.619 START TEST allowed 00:03:46.619 ************************************ 00:03:46.619 12:03:32 -- common/autotest_common.sh@1104 -- # allowed 00:03:46.619 12:03:32 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:46.619 12:03:32 -- setup/acl.sh@45 -- # setup output config 00:03:46.619 12:03:32 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:46.619 12:03:32 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:46.619 12:03:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:50.805 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:50.805 12:03:37 -- setup/acl.sh@47 -- # verify 00:03:50.805 12:03:37 -- setup/acl.sh@28 -- # local dev driver 00:03:50.805 12:03:37 -- setup/acl.sh@48 -- # setup reset 00:03:50.805 12:03:37 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:50.805 12:03:37 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:54.997 00:03:54.997 real 0m8.362s 00:03:54.997 user 0m2.222s 00:03:54.997 sys 0m4.627s 00:03:54.997 12:03:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:54.997 12:03:41 -- common/autotest_common.sh@10 -- # set +x 00:03:54.997 ************************************ 00:03:54.997 END TEST allowed 00:03:54.997 ************************************ 00:03:54.997 00:03:54.997 real 0m23.867s 00:03:54.997 user 0m7.385s 00:03:54.997 sys 0m14.587s 00:03:54.997 12:03:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:54.997 12:03:41 -- common/autotest_common.sh@10 -- # set +x 00:03:54.997 ************************************ 00:03:54.997 END TEST acl 00:03:54.997 ************************************ 00:03:54.997 12:03:41 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:54.997 12:03:41 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:54.997 12:03:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:54.997 12:03:41 -- common/autotest_common.sh@10 -- # set +x 00:03:54.997 ************************************ 00:03:54.997 START TEST hugepages 00:03:54.997 ************************************ 00:03:54.997 12:03:41 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:54.997 * Looking for test storage... 
00:03:54.997 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:54.997 12:03:41 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:54.997 12:03:41 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:54.997 12:03:41 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:54.997 12:03:41 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:54.997 12:03:41 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:54.997 12:03:41 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:54.997 12:03:41 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:54.997 12:03:41 -- setup/common.sh@18 -- # local node= 00:03:54.997 12:03:41 -- setup/common.sh@19 -- # local var val 00:03:54.997 12:03:41 -- setup/common.sh@20 -- # local mem_f mem 00:03:54.997 12:03:41 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.997 12:03:41 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.997 12:03:41 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.997 12:03:41 -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.997 12:03:41 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.997 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.997 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.997 12:03:41 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 39151000 kB' 'MemAvailable: 42022116 kB' 'Buffers: 15072 kB' 'Cached: 12389828 kB' 'SwapCached: 60 kB' 'Active: 7363980 kB' 'Inactive: 5519360 kB' 'Active(anon): 6472044 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 481848 kB' 'Mapped: 186604 kB' 'Shmem: 9336604 kB' 'KReclaimable: 565156 kB' 'Slab: 1514644 kB' 'SReclaimable: 565156 kB' 'SUnreclaim: 949488 kB' 'KernelStack: 21744 kB' 'PageTables: 8248 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 11057144 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217780 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB' 00:03:54.997 12:03:41 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.997 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.997 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.997 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.997 12:03:41 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.997 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.997 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.997 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.997 12:03:41 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.997 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.997 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.997 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.997 12:03:41 -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 12:03:41 -- setup/common.sh@32 -- # continue 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 12:03:41 -- setup/common.sh@31 -- # read -r var val _
[... the identical 'setup/common.sh@32 -- # [[ <field> == \H\u\g\e\p\a\g\e\s\i\z\e ]] / continue / IFS=': ' / read -r var val _' trace repeats for every /proc/meminfo field from Cached through PageTables ...]
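The wall of [[ ... == \H\u\g\e\p\a\g\e\s\i\z\e ]] checks above (resuming with SecPageTables below) is setup/common.sh's get_meminfo scanning /proc/meminfo one 'field: value' pair at a time: mapfile reads the whole file, any 'Node <n>' prefix is stripped, and the loop discards every field until the requested one, here Hugepagesize, turns up. A minimal self-contained sketch of the same lookup (simplified; the traced helper also supports per-node meminfo files):

# Sketch only: return the value of one /proc/meminfo field.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && echo "$val" && return 0
    done < /proc/meminfo
    return 1
}
get_meminfo Hugepagesize    # prints 2048 (kB) on this machine

As a consistency check against the snapshot above: HugePages_Total 2048 x Hugepagesize 2048 kB = 4194304 kB, which is exactly the reported Hugetlb value, i.e. 4 GiB of preallocated hugepage memory.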
00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 
00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.998 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.998 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.999 12:03:41 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.999 12:03:41 -- setup/common.sh@32 -- # continue 00:03:54.999 12:03:41 -- setup/common.sh@31 -- # IFS=': ' 00:03:54.999 12:03:41 -- setup/common.sh@31 -- # read -r var val _ 00:03:54.999 12:03:41 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:54.999 12:03:41 -- setup/common.sh@33 -- # echo 2048 00:03:54.999 12:03:41 -- setup/common.sh@33 -- # return 0 00:03:54.999 12:03:41 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:54.999 12:03:41 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:54.999 12:03:41 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:54.999 12:03:41 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:54.999 12:03:41 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:54.999 12:03:41 -- 
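The scan that just completed follows a plain read-and-match pattern. A minimal sketch of the same idea, for reference (the helper name get_meminfo_sketch and the direct /proc/meminfo path are assumptions for illustration, not the exact setup/common.sh code):

    get_meminfo_sketch() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skip non-matching keys, as traced above
        echo "$val"                        # e.g. 2048 for Hugepagesize on this host
        return 0
      done < /proc/meminfo
      return 1                             # key not present
    }
    get_meminfo_sketch Hugepagesize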
00:03:54.999 12:03:41 -- setup/hugepages.sh@23 -- # unset -v HUGENODE
00:03:54.999 12:03:41 -- setup/hugepages.sh@24 -- # unset -v NRHUGE
00:03:54.999 12:03:41 -- setup/hugepages.sh@207 -- # get_nodes
00:03:54.999 12:03:41 -- setup/hugepages.sh@27 -- # local node
00:03:54.999 12:03:41 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:54.999 12:03:41 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048
00:03:54.999 12:03:41 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:54.999 12:03:41 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:54.999 12:03:41 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:54.999 12:03:41 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:54.999 12:03:41 -- setup/hugepages.sh@208 -- # clear_hp
00:03:54.999 12:03:41 -- setup/hugepages.sh@37 -- # local node hp
[... xtrace elided: setup/hugepages.sh@39-41 writes 'echo 0' to every hugepages-*/nr_hugepages entry under /sys/devices/system/node/node$node for each of the two NUMA nodes ...]
00:03:54.999 12:03:41 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:03:54.999 12:03:41 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:03:54.999 12:03:41 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup
00:03:54.999 12:03:41 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:54.999 12:03:41 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:54.999 12:03:41 -- common/autotest_common.sh@10 -- # set +x
00:03:54.999 ************************************
00:03:54.999 START TEST default_setup
00:03:54.999 ************************************
00:03:54.999 12:03:41 -- common/autotest_common.sh@1104 -- # default_setup
00:03:54.999 12:03:41 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0
00:03:54.999 12:03:41 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:54.999 12:03:41 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:54.999 12:03:41 -- setup/hugepages.sh@51 -- # shift
00:03:54.999 12:03:41 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:54.999 12:03:41 -- setup/hugepages.sh@52 -- # local node_ids
00:03:54.999 12:03:41 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:54.999 12:03:41 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:54.999 12:03:41 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:54.999 12:03:41 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:54.999 12:03:41 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:54.999 12:03:41 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:54.999 12:03:41 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:54.999 12:03:41 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:54.999 12:03:41 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:54.999 12:03:41 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
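The page-count arithmetic traced here is straightforward: a requested pool of 2097152 kB divided by the 2048 kB default hugepage size gives 1024 pages, all assigned to the single user-supplied node. A sketch of that computation under those assumptions (variable names are illustrative, not the SPDK helper itself):

    size_kb=2097152                      # requested pool size in kB (test argument)
    hugepage_kb=2048                     # default hugepage size from the scan above
    nr_hugepages=$(( size_kb / hugepage_kb ))
    echo "nr_hugepages=$nr_hugepages"    # -> nr_hugepages=1024
    # the whole pool is pinned to the one user-supplied node, node 0:
    declare -a nodes_test=()
    nodes_test[0]=$nr_hugepages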
00:03:54.999 12:03:41 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:54.999 12:03:41 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:54.999 12:03:41 -- setup/hugepages.sh@73 -- # return 0
00:03:54.999 12:03:41 -- setup/hugepages.sh@137 -- # setup output
00:03:54.999 12:03:41 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:54.999 12:03:41 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:57.531 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:03:57.531 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:03:59.440 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:03:59.440 12:03:45 -- setup/hugepages.sh@138 -- # verify_nr_hugepages
00:03:59.440 12:03:45 -- setup/hugepages.sh@89 -- # local node
00:03:59.440 12:03:45 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:59.440 12:03:45 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:59.440 12:03:45 -- setup/hugepages.sh@92 -- # local surp
00:03:59.440 12:03:45 -- setup/hugepages.sh@93 -- # local resv
00:03:59.440 12:03:45 -- setup/hugepages.sh@94 -- # local anon
00:03:59.440 12:03:45 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:59.440 12:03:45 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:59.440 12:03:45 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:59.440 12:03:45 -- setup/common.sh@18 -- # local node=
00:03:59.440 12:03:45 -- setup/common.sh@20 -- # local mem_f mem
00:03:59.440 12:03:45 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.440 12:03:45 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:59.440 12:03:45 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.440 12:03:45 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.440 12:03:45 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41384844 kB' 'MemAvailable: 44255928 kB' 'Buffers: 15072 kB' 'Cached: 12389948 kB' 'SwapCached: 60 kB' 'Active: 7372472 kB' 'Inactive: 5519360 kB' 'Active(anon): 6480536 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490344 kB' 'Mapped: 186780 kB' 'Shmem: 9336724 kB' 'KReclaimable: 565124 kB' 'Slab: 1513876 kB' 'SReclaimable: 565124 kB' 'SUnreclaim: 948752 kB' 'KernelStack: 21952 kB' 'PageTables: 8436 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11064512 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217972 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
[... xtrace elided: the per-field scan skips every key of this snapshot until AnonHugePages matches ...]
00:03:59.441 12:03:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:59.441 12:03:46 -- setup/common.sh@33 -- # echo 0
00:03:59.441 12:03:46 -- setup/common.sh@33 -- # return 0
00:03:59.441 12:03:46 -- setup/hugepages.sh@97 -- # anon=0
00:03:59.441 12:03:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:59.441 12:03:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:59.441 12:03:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.441 12:03:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41390340 kB' 'MemAvailable: 44261424 kB' 'Buffers: 15072 kB' 'Cached: 12389952 kB' 'SwapCached: 60 kB' 'Active: 7372216 kB' 'Inactive: 5519360 kB' 'Active(anon): 6480280 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490020 kB' 'Mapped: 186744 kB' 'Shmem: 9336728 kB' 'KReclaimable: 565124 kB' 'Slab: 1513876 kB' 'SReclaimable: 565124 kB' 'SUnreclaim: 948752 kB' 'KernelStack: 22080 kB' 'PageTables: 8712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11064520 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217908 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
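The mapfile step visible in these lookups snapshots /proc/meminfo into an array before matching, stripping the 'Node N ' prefix that per-node meminfo files carry. A condensed sketch of that pattern, assuming the global /proc/meminfo (per-node files live under /sys/devices/system/node/node*/meminfo):

    shopt -s extglob                     # needed for the +([0-9]) pattern below
    mapfile -t mem < /proc/meminfo       # one array element per meminfo line
    mem=("${mem[@]#Node +([0-9]) }")     # strip 'Node N ' prefixes; a no-op for
                                         # the global file, needed for per-node files
    for line in "${mem[@]}"; do
      IFS=': ' read -r var val _ <<< "$line"
      [[ $var == HugePages_Total ]] && { echo "$val"; break; }
    done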
[... xtrace elided: the same per-field scan repeats until HugePages_Surp matches ...]
00:03:59.442 12:03:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:59.442 12:03:46 -- setup/common.sh@33 -- # echo 0
00:03:59.442 12:03:46 -- setup/common.sh@33 -- # return 0
00:03:59.442 12:03:46 -- setup/hugepages.sh@99 -- # surp=0
00:03:59.442 12:03:46 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:59.443 12:03:46 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:59.443 12:03:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.443 12:03:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41389248 kB' 'MemAvailable: 44260332 kB' 'Buffers: 15072 kB' 'Cached: 12389952 kB' 'SwapCached: 60 kB' 'Active: 7372068 kB' 'Inactive: 5519360 kB' 'Active(anon): 6480132 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489760 kB' 'Mapped: 186652 kB' 'Shmem: 9336728 kB' 'KReclaimable: 565124 kB' 'Slab: 1513732 kB' 'SReclaimable: 565124 kB' 'SUnreclaim: 948608 kB' 'KernelStack: 22000 kB' 'PageTables: 8616 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11064536 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217908 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
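For comparison only, the four HugePages_* counters these repeated scans extract can also be pulled in one pass with awk; this is an illustrative alternative, not what the traced scripts do:

    awk '/^HugePages_(Total|Free|Rsvd|Surp):/ { print $1, $2 }' /proc/meminfo
    # Expected on this host, per the snapshots above:
    #   HugePages_Total: 1024
    #   HugePages_Free: 1024
    #   HugePages_Rsvd: 0
    #   HugePages_Surp: 0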
[... xtrace elided: the per-field scan repeats once more until HugePages_Rsvd matches ...]
00:03:59.444 12:03:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:59.444 12:03:46 -- setup/common.sh@33 -- # echo 0
00:03:59.444 12:03:46 -- setup/common.sh@33 -- # return 0
00:03:59.444 12:03:46 -- setup/hugepages.sh@100 -- # resv=0
00:03:59.444 12:03:46 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:03:59.444 12:03:46 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:03:59.444 12:03:46 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:03:59.444 12:03:46 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:03:59.444 12:03:46 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:59.444 12:03:46 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
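The checks just above capture the verification idea: the pool the test configured should be exactly what the kernel reports, with nothing surplus or reserved. A standalone sketch of the same checks (awk extraction and variable names are illustrative; the autotest uses the get_meminfo loop traced throughout):

    expected=1024
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
    if (( total == expected && surp == 0 && resv == 0 )); then
      echo "hugepage pool verified: $total pages"
    else
      echo "unexpected hugepage state: total=$total surp=$surp resv=$resv" >&2
      exit 1
    fi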
00:03:59.444 12:03:46 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:59.444 12:03:46 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:59.444 12:03:46 -- setup/common.sh@18 -- # local node=
00:03:59.444 12:03:46 -- setup/common.sh@19 -- # local var val
00:03:59.444 12:03:46 -- setup/common.sh@20 -- # local mem_f mem
00:03:59.444 12:03:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.444 12:03:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:59.444 12:03:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:59.444 12:03:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.444 12:03:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.444 12:03:46 -- setup/common.sh@31 -- # IFS=': '
00:03:59.444 12:03:46 -- setup/common.sh@31 -- # read -r var val _
00:03:59.444 12:03:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41391132 kB' 'MemAvailable: 44262216 kB' 'Buffers: 15072 kB' 'Cached: 12389980 kB' 'SwapCached: 60 kB' 'Active: 7372304 kB' 'Inactive: 5519360 kB' 'Active(anon): 6480368 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489932 kB' 'Mapped: 186652 kB' 'Shmem: 9336756 kB' 'KReclaimable: 565124 kB' 'Slab: 1513732 kB' 'SReclaimable: 565124 kB' 'SUnreclaim: 948608 kB' 'KernelStack: 22064 kB' 'PageTables: 8744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11064556 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217972 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
[setup/common.sh@31-32 loop: every key from MemTotal through Unaccepted is tested against HugePages_Total and skipped with continue]
00:03:59.446 12:03:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:59.446 12:03:46 -- setup/common.sh@33 -- # echo 1024
00:03:59.446 12:03:46 -- setup/common.sh@33 -- # return 0
00:03:59.446 12:03:46 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:59.446 12:03:46 -- setup/hugepages.sh@112 -- # get_nodes
00:03:59.446 12:03:46 -- setup/hugepages.sh@27 -- # local node
00:03:59.446 12:03:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:59.446 12:03:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:59.446 12:03:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:59.446 12:03:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:59.446 12:03:46 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:59.446 12:03:46 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:59.446 12:03:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:59.446 12:03:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
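get_nodes, traced above, enumerates the NUMA node directories and records each node's current hugepage count in nodes_sys; here node0 carries all 1024 pages and node1 none, which is why the verifier later prints 'node0=1024 expecting 1024'. A sketch of that enumeration (extglob assumed enabled; the per-node hugepages-2048kB sysfs path and the cat are assumptions for illustration, since the trace only shows the resulting assignments):

declare -a nodes_sys

get_nodes() {
    local node
    for node in /sys/devices/system/node/node+([0-9]); do
        # ${node##*node} keeps only the numeric suffix: .../node0 -> 0
        nodes_sys[${node##*node}]=$(cat "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}
    (( no_nodes > 0 ))   # fail if no NUMA node directories were found
}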
00:03:59.446 12:03:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:59.446 12:03:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:59.446 12:03:46 -- setup/common.sh@18 -- # local node=0
00:03:59.446 12:03:46 -- setup/common.sh@19 -- # local var val
00:03:59.446 12:03:46 -- setup/common.sh@20 -- # local mem_f mem
00:03:59.446 12:03:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:59.446 12:03:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:59.446 12:03:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:59.446 12:03:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:59.446 12:03:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:59.446 12:03:46 -- setup/common.sh@31 -- # IFS=': '
00:03:59.446 12:03:46 -- setup/common.sh@31 -- # read -r var val _
00:03:59.446 12:03:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24021540 kB' 'MemUsed: 8612896 kB' 'SwapCached: 56 kB' 'Active: 4873476 kB' 'Inactive: 391308 kB' 'Active(anon): 4079148 kB' 'Inactive(anon): 120 kB' 'Active(file): 794328 kB' 'Inactive(file): 391188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4871984 kB' 'Mapped: 67420 kB' 'AnonPages: 395952 kB' 'Shmem: 3686412 kB' 'KernelStack: 12088 kB' 'PageTables: 5744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 200096 kB' 'Slab: 640856 kB' 'SReclaimable: 200096 kB' 'SUnreclaim: 440760 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[setup/common.sh@31-32 loop: every key in the node0 meminfo from MemTotal through HugePages_Free is tested against HugePages_Surp and skipped with continue]
00:03:59.447 12:03:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:59.447 12:03:46 -- setup/common.sh@33 -- # echo 0
00:03:59.447 12:03:46 -- setup/common.sh@33 -- # return 0
00:03:59.447 12:03:46 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:59.447 12:03:46 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:59.447 12:03:46 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:59.447 12:03:46 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:59.447 12:03:46 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
00:03:59.447 12:03:46 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:59.447
00:03:59.447 real 0m4.673s
00:03:59.447 user 0m1.032s
00:03:59.447 sys 0m2.019s
00:03:59.447 12:03:46 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:03:59.447 12:03:46 -- common/autotest_common.sh@10 -- # set +x
00:03:59.447 ************************************
00:03:59.447 END TEST default_setup
00:03:59.447 ************************************
00:03:59.447 12:03:46 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:59.447 12:03:46 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:03:59.447 12:03:46 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:03:59.447 12:03:46 -- common/autotest_common.sh@10 -- # set +x
00:03:59.447 ************************************
00:03:59.447 START TEST per_node_1G_alloc
00:03:59.447 ************************************
00:03:59.447 12:03:46 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc
00:03:59.447 12:03:46 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:59.447 12:03:46 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:59.447 12:03:46 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:59.447 12:03:46 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:59.447 12:03:46 -- setup/hugepages.sh@51 -- # shift
00:03:59.447 12:03:46 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:59.447 12:03:46 -- setup/hugepages.sh@52 -- # local node_ids
00:03:59.447 12:03:46 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:59.447 12:03:46 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:59.447 12:03:46 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:59.447 12:03:46 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:59.447 12:03:46 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:59.447 12:03:46 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:59.447 12:03:46 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:59.447 12:03:46 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:59.447 12:03:46 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:59.447 12:03:46 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:59.447 12:03:46 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:59.447 12:03:46 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:59.447 12:03:46 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:59.447 12:03:46 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:59.447 12:03:46 -- setup/hugepages.sh@73 -- # return 0
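per_node_1G_alloc asks for 1 GiB of hugepages on each of nodes 0 and 1. The get_test_nr_hugepages call traced above converts the 1048576 kB request into pages of the default 2048 kB size, 1048576 / 2048 = 512, and records 512 in nodes_test for each listed node, so the setup step below must end up with 2 x 512 = 1024 pages in total. A sketch of that conversion (names follow the trace; hard-coding default_hugepages=2048 is an assumption taken from the Hugepagesize field above):

declare -a nodes_test

get_test_nr_hugepages() {
    local size=$1                        # request in kB, e.g. 1048576
    shift
    local node_ids=("$@")                # remaining arguments are node ids
    local default_hugepages=2048         # kB, from Hugepagesize in /proc/meminfo
    (( size >= default_hugepages )) || return 1
    nr_hugepages=$(( size / default_hugepages ))   # 1048576 / 2048 = 512
    local node
    for node in "${node_ids[@]}"; do
        nodes_test[node]=$nr_hugepages   # 512 pages requested on each node
    done
}

get_test_nr_hugepages 1048576 0 1 reproduces exactly the values in the trace.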
00:03:59.447 12:03:46 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:59.447 12:03:46 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:59.447 12:03:46 -- setup/hugepages.sh@146 -- # setup output
00:03:59.447 12:03:46 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:59.447 12:03:46 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:01.977 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:01.977 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:02.238 12:03:49 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
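With NRHUGE=512 and HUGENODE=0,1 exported, scripts/setup.sh reserves the pages per node and rebinds the test devices (the 'Already using the vfio-pci driver' lines are its binding pass); hugepages.sh@147 then records the expected total of 1024. The standard kernel interface for such a per-node reservation is the sysfs knob below; this is only the gist of what the reservation amounts to, not setup.sh's actual code:

for node in 0 1; do
    echo 512 | sudo tee \
        "/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages"
done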
00:04:02.238 12:03:49 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:02.238 12:03:49 -- setup/hugepages.sh@89 -- # local node
00:04:02.238 12:03:49 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:02.238 12:03:49 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:02.238 12:03:49 -- setup/hugepages.sh@92 -- # local surp
00:04:02.238 12:03:49 -- setup/hugepages.sh@93 -- # local resv
00:04:02.238 12:03:49 -- setup/hugepages.sh@94 -- # local anon
00:04:02.238 12:03:49 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:02.238 12:03:49 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:02.238 12:03:49 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:02.238 12:03:49 -- setup/common.sh@18 -- # local node=
00:04:02.238 12:03:49 -- setup/common.sh@19 -- # local var val
00:04:02.238 12:03:49 -- setup/common.sh@20 -- # local mem_f mem
00:04:02.238 12:03:49 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:02.238 12:03:49 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:02.238 12:03:49 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:02.238 12:03:49 -- setup/common.sh@28 -- # mapfile -t mem
00:04:02.238 12:03:49 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:02.238 12:03:49 -- setup/common.sh@31 -- # IFS=': '
00:04:02.238 12:03:49 -- setup/common.sh@31 -- # read -r var val _
00:04:02.238 12:03:49 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41367176 kB' 'MemAvailable: 44238260 kB' 'Buffers: 15072 kB' 'Cached: 12390064 kB' 'SwapCached: 60 kB' 'Active: 7374052 kB' 'Inactive: 5519360 kB' 'Active(anon): 6482116 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491564 kB' 'Mapped: 186668 kB' 'Shmem: 9336840 kB' 'KReclaimable: 565124 kB' 'Slab: 1513472 kB' 'SReclaimable: 565124 kB' 'SUnreclaim: 948348 kB' 'KernelStack: 22272 kB' 'PageTables: 9156 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11065344 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
[setup/common.sh@31-32 loop: every key from MemTotal through HardwareCorrupted is tested against AnonHugePages and skipped with continue]
00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:02.240 12:03:49 -- setup/common.sh@33 -- # echo 0
00:04:02.240 12:03:49 -- setup/common.sh@33 -- # return 0
00:04:02.240 12:03:49 -- setup/hugepages.sh@97 -- # anon=0
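verify_nr_hugepages counts transparent hugepages toward the expected total only when THP is not disabled: the hugepages.sh@96 test above checks that /sys/kernel/mm/transparent_hugepage/enabled (here 'always [madvise] never') does not have '[never]' selected, and only then samples AnonHugePages, which is 0 kB on this host, so anon=0. A sketch of that guard (standard sysfs path; variable names follow the trace):

anon=0
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
    anon=$(get_meminfo AnonHugePages)                  # kB of THP-backed anonymous memory
fi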
'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB' 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.240 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.240 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.240 12:03:49 -- 
setup/common.sh@31 -- # read -r var val _
[xtrace condensed: setup/common.sh@32 compares each remaining /proc/meminfo field (Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free) against HugePages_Surp and continues past each one]
00:04:02.241 12:03:49 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:02.241 12:03:49 -- setup/common.sh@32 -- # continue
00:04:02.241 12:03:49 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:02.241 12:03:49 -- setup/common.sh@33 -- # echo 0
00:04:02.241 12:03:49 -- setup/common.sh@33 -- # return 0
00:04:02.241 12:03:49 -- setup/hugepages.sh@99 -- # surp=0
00:04:02.241 12:03:49 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:02.241 12:03:49 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:02.241 12:03:49 -- setup/common.sh@18 -- # local node=
00:04:02.241 12:03:49 -- setup/common.sh@19 -- # local var val
00:04:02.241 12:03:49 -- setup/common.sh@20 -- # local mem_f mem
00:04:02.241 12:03:49 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:02.241 12:03:49 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:02.241 12:03:49 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:02.241 12:03:49 -- setup/common.sh@28 -- # mapfile -t mem
00:04:02.241 12:03:49 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:02.241 12:03:49 -- setup/common.sh@31 -- # IFS=': '
00:04:02.241 12:03:49 -- setup/common.sh@31 -- # read -r var val _
00:04:02.241 12:03:49 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41372268 kB' 'MemAvailable: 44243352 kB' 'Buffers: 15072 kB' 'Cached: 12390064 kB' 'SwapCached: 60 kB' 'Active: 7373736 kB' 'Inactive: 5519360 kB' 'Active(anon): 6481800 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491184 kB' 'Mapped: 186660 kB' 'Shmem: 9336840 kB' 'KReclaimable: 565124 kB' 'Slab: 1513508 kB' 'SReclaimable: 565124 kB' 'SUnreclaim: 948384 kB' 'KernelStack: 22176 kB' 'PageTables: 9196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11065372 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
[xtrace condensed: setup/common.sh@32 walks the snapshot above field by field, continuing past every name until it reaches HugePages_Rsvd]
00:04:02.243 12:03:49 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:02.243 12:03:49 -- setup/common.sh@33 -- # echo 0
00:04:02.243 12:03:49 -- setup/common.sh@33 -- # return 0
00:04:02.243 12:03:49 -- setup/hugepages.sh@100 -- # resv=0
00:04:02.243 12:03:49 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:02.243 nr_hugepages=1024
00:04:02.243 12:03:49 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:02.243 resv_hugepages=0
00:04:02.243 12:03:49 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:02.243 surplus_hugepages=0
00:04:02.243 12:03:49 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:02.243 anon_hugepages=0
00:04:02.243 12:03:49 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:02.243 12:03:49 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
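The trace above is setup/common.sh's get_meminfo walking /proc/meminfo one line at a time: it splits each line on IFS=': ', skips every field that is not the one requested, and echoes the value when the names match; hugepages.sh then checks that the kernel's HugePages_Total equals the pool the test configured plus surplus and reserved pages. A condensed sketch of that pattern (the helper name is hypothetical, not the SPDK function itself):

    #!/usr/bin/env bash
    # Sketch of the lookup pattern traced above: scan a meminfo-style file
    # for one field and print its value.
    get_meminfo_value() {
        local get=$1 file=${2:-/proc/meminfo} var val _
        while IFS=': ' read -r var val _; do
            # "HugePages_Total:   1024" splits into var=HugePages_Total, val=1024
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done <"$file"
        return 1
    }

    nr=$(get_meminfo_value HugePages_Total)    # 1024 in the snapshot above
    surp=$(get_meminfo_value HugePages_Surp)   # 0
    resv=$(get_meminfo_value HugePages_Rsvd)   # 0
    # The same accounting check hugepages.sh@107 applies:
    (( 1024 == nr + surp + resv )) && echo 'hugepage accounting consistent'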
00:04:02.243 12:03:49 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:02.243 12:03:49 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:02.243 12:03:49 -- setup/common.sh@18 -- # local node=
00:04:02.243 12:03:49 -- setup/common.sh@19 -- # local var val
00:04:02.243 12:03:49 -- setup/common.sh@20 -- # local mem_f mem
00:04:02.243 12:03:49 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:02.243 12:03:49 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:02.243 12:03:49 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:02.243 12:03:49 -- setup/common.sh@28 -- # mapfile -t mem
00:04:02.243 12:03:49 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:02.243 12:03:49 -- setup/common.sh@31 -- # IFS=': '
00:04:02.243 12:03:49 -- setup/common.sh@31 -- # read -r var val _
00:04:02.243 12:03:49 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41379240 kB' 'MemAvailable: 44250324 kB' 'Buffers: 15072 kB' 'Cached: 12390096 kB' 'SwapCached: 60 kB' 'Active: 7373244 kB' 'Inactive: 5519360 kB' 'Active(anon): 6481308 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490688 kB' 'Mapped: 186660 kB' 'Shmem: 9336872 kB' 'KReclaimable: 565124 kB' 'Slab: 1513348 kB' 'SReclaimable: 565124 kB' 'SUnreclaim: 948224 kB' 'KernelStack: 22016 kB' 'PageTables: 9044 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11065388 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
[xtrace condensed: setup/common.sh@32 continues past every field of the snapshot above until it reaches HugePages_Total]
00:04:02.244 12:03:49 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:02.244 12:03:49 -- setup/common.sh@33 -- # echo 1024
00:04:02.244 12:03:49 -- setup/common.sh@33 -- # return 0
00:04:02.244 12:03:49 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:02.244 12:03:49 -- setup/hugepages.sh@112 -- # get_nodes
00:04:02.244 12:03:49 -- setup/hugepages.sh@27 -- # local node
00:04:02.244 12:03:49 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:02.244 12:03:49 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:02.244 12:03:49 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:02.244 12:03:49 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:02.244 12:03:49 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:02.244 12:03:49 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
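When get_meminfo is called with a node argument (as in the get_meminfo HugePages_Surp 0 call traced below), it swaps mem_f over to the per-node sysfs file and strips the "Node <n> " prefix that every line there carries before running the same field/value loop. A minimal sketch of that per-node variant, assuming extglob for the prefix strip (hypothetical helper name):

    # Sketch of the per-node path traced below: node meminfo lives in sysfs
    # and each line is prefixed with "Node <n> ", which must be removed
    # before the usual field/value parsing.
    shopt -s extglob
    node_meminfo_value() {
        local get=$1 node=$2 mem var val _ line
        local mem_f=/sys/devices/system/node/node$node/meminfo
        [[ -e $mem_f ]] || return 1
        mapfile -t mem <"$mem_f"
        # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<<"$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }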
00:04:02.244 12:03:49 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:02.244 12:03:49 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:02.244 12:03:49 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:02.244 12:03:49 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:02.244 12:03:49 -- setup/common.sh@18 -- # local node=0
00:04:02.244 12:03:49 -- setup/common.sh@19 -- # local var val
00:04:02.244 12:03:49 -- setup/common.sh@20 -- # local mem_f mem
00:04:02.244 12:03:49 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:02.244 12:03:49 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:02.244 12:03:49 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:02.244 12:03:49 -- setup/common.sh@28 -- # mapfile -t mem
00:04:02.244 12:03:49 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:02.244 12:03:49 -- setup/common.sh@31 -- # IFS=': '
00:04:02.244 12:03:49 -- setup/common.sh@31 -- # read -r var val _
00:04:02.244 12:03:49 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 25079960 kB' 'MemUsed: 7554476 kB' 'SwapCached: 56 kB' 'Active: 4873548 kB' 'Inactive: 391308 kB' 'Active(anon): 4079220 kB' 'Inactive(anon): 120 kB' 'Active(file): 794328 kB' 'Inactive(file): 391188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4871992 kB' 'Mapped: 67428 kB' 'AnonPages: 395968 kB' 'Shmem: 3686420 kB' 'KernelStack: 11896 kB' 'PageTables: 5324 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 200096 kB' 'Slab: 640908 kB' 'SReclaimable: 200096 kB' 'SUnreclaim: 440812 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace condensed: setup/common.sh@32 continues past every node0 field above until it reaches HugePages_Surp]
00:04:02.245 12:03:49 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:02.245 12:03:49 -- setup/common.sh@33 -- # echo 0
00:04:02.245 12:03:49 -- setup/common.sh@33 -- # return 0
00:04:02.245 12:03:49 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:02.245 12:03:49 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:02.245 12:03:49 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:02.245 12:03:49 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:02.245 12:03:49 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:02.245 12:03:49 -- setup/common.sh@18 -- # local node=1
00:04:02.245 12:03:49 -- setup/common.sh@19 -- # local var val
00:04:02.245 12:03:49 -- setup/common.sh@20 -- # local mem_f mem
00:04:02.245 12:03:49 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:02.245 12:03:49 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:02.245 12:03:49 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:02.245 12:03:49 -- setup/common.sh@28 -- # mapfile -t mem
00:04:02.245 12:03:49 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:02.245 12:03:49 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649356 kB' 'MemFree: 16308208 kB' 'MemUsed: 11341148 kB' 'SwapCached: 4 kB' 'Active: 2499620 kB' 'Inactive: 5128052 kB' 'Active(anon): 2402012 kB' 'Inactive(anon): 3342880 kB' 'Active(file): 97608 kB' 'Inactive(file): 1785172 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7533264 kB' 'Mapped: 119248 kB' 'AnonPages: 94608 kB' 'Shmem: 5650480 kB' 'KernelStack: 10040 kB' 'PageTables: 3016 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 365028 kB' 'Slab: 872760 kB' 'SReclaimable: 365028 kB' 'SUnreclaim: 507732 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
continue 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.246 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.246 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.505 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.505 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.505 12:03:49 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.505 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.505 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.505 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.505 12:03:49 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.505 12:03:49 -- setup/common.sh@32 -- # continue 00:04:02.505 12:03:49 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.505 12:03:49 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.505 12:03:49 -- setup/common.sh@32 -- # [[ 
AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:02.505 12:03:49 [xtrace condensed: setup/common.sh@31-32 reads each remaining /proc/meminfo key (AnonPages through HugePages_Free) and continues past every key that is not HugePages_Surp]
00:04:02.506 12:03:49 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:02.506 12:03:49 -- setup/common.sh@33 -- # echo 0
00:04:02.506 12:03:49 -- setup/common.sh@33 -- # return 0
00:04:02.506 12:03:49 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:02.506 12:03:49 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:02.506 12:03:49 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:02.506 12:03:49 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:02.506 12:03:49 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:02.506 node0=512 expecting 512
00:04:02.506 12:03:49 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:02.506 12:03:49 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:02.506 12:03:49 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:02.506 12:03:49 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:02.506 node1=512 expecting 512
00:04:02.506 12:03:49 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:02.506
00:04:02.506 real 0m3.046s
00:04:02.506 user 0m0.944s
00:04:02.506 sys 0m1.924s
00:04:02.506 12:03:49 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:02.506 12:03:49 -- common/autotest_common.sh@10 -- # set +x
00:04:02.506 ************************************
00:04:02.506 END TEST per_node_1G_alloc
00:04:02.506 ************************************
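The node0/node1 lines above come from the sorted_t pattern visible in the trace: each node's page count is used as an associative-array key, so equal counts collapse into a single entry. A minimal standalone sketch of that tally, assuming two nodes with 512 pages each; the variable names mirror the trace, but this is an illustration, not the script itself:

    # Collapse per-node hugepage counts into a set of distinct values.
    declare -A sorted_t
    nodes_test=([0]=512 [1]=512)   # counts observed per NUMA node

    for node in "${!nodes_test[@]}"; do
        # Using the count itself as the key deduplicates equal counts.
        sorted_t[${nodes_test[$node]}]=1
        echo "node${node}=${nodes_test[$node]} expecting 512"
    done

    # One distinct count across all nodes means the split was even.
    (( ${#sorted_t[@]} == 1 )) && echo 'allocation is even across nodes'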
setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:02.506 12:03:49 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:02.506 12:03:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:02.506 12:03:49 -- common/autotest_common.sh@10 -- # set +x 00:04:02.506 ************************************ 00:04:02.506 START TEST even_2G_alloc 00:04:02.506 ************************************ 00:04:02.506 12:03:49 -- common/autotest_common.sh@1104 -- # even_2G_alloc 00:04:02.506 12:03:49 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:02.506 12:03:49 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:02.506 12:03:49 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:02.506 12:03:49 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:02.506 12:03:49 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:02.506 12:03:49 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:02.506 12:03:49 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:02.506 12:03:49 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:02.506 12:03:49 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:02.506 12:03:49 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:02.506 12:03:49 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:02.506 12:03:49 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:02.506 12:03:49 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:02.506 12:03:49 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:02.506 12:03:49 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:02.506 12:03:49 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:02.506 12:03:49 -- setup/hugepages.sh@83 -- # : 512 00:04:02.506 12:03:49 -- setup/hugepages.sh@84 -- # : 1 00:04:02.506 12:03:49 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:02.506 12:03:49 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:02.506 12:03:49 -- setup/hugepages.sh@83 -- # : 0 00:04:02.506 12:03:49 -- setup/hugepages.sh@84 -- # : 0 00:04:02.506 12:03:49 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:02.506 12:03:49 -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:02.506 12:03:49 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:02.506 12:03:49 -- setup/hugepages.sh@153 -- # setup output 00:04:02.506 12:03:49 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:02.506 12:03:49 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:05.796 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:05.796 0000:80:04.1 (8086 
2021): Already using the vfio-pci driver
00:04:05.796 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:05.796 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:05.796 12:03:52 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:04:05.796 12:03:52 -- setup/hugepages.sh@89 -- # local node
00:04:05.796 12:03:52 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:05.796 12:03:52 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:05.796 12:03:52 -- setup/hugepages.sh@92 -- # local surp
00:04:05.796 12:03:52 -- setup/hugepages.sh@93 -- # local resv
00:04:05.796 12:03:52 -- setup/hugepages.sh@94 -- # local anon
00:04:05.796 12:03:52 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:05.796 12:03:52 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:05.796 12:03:52 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:05.796 12:03:52 -- setup/common.sh@18 -- # local node=
00:04:05.796 12:03:52 -- setup/common.sh@19 -- # local var val
00:04:05.796 12:03:52 -- setup/common.sh@20 -- # local mem_f mem
00:04:05.796 12:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:05.796 12:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:05.796 12:03:52 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:05.796 12:03:52 -- setup/common.sh@28 -- # mapfile -t mem
00:04:05.796 12:03:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:05.796 12:03:52 -- setup/common.sh@31 -- # IFS=': '
00:04:05.796 12:03:52 -- setup/common.sh@31 -- # read -r var val _
00:04:05.797 12:03:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41407856 kB' 'MemAvailable: 44278780 kB' 'Buffers: 15072 kB' 'Cached: 12390200 kB' 'SwapCached: 60 kB' 'Active: 7372560 kB' 'Inactive: 5519360 kB' 'Active(anon): 6480624 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490012 kB' 'Mapped: 185716 kB' 'Shmem: 9336976 kB' 'KReclaimable: 564964 kB' 'Slab: 1513052 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 948088 kB' 'KernelStack: 21776 kB' 'PageTables: 8020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11053428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
00:04:05.797 12:03:52 [xtrace condensed: setup/common.sh@31-32 compares every meminfo key (MemTotal through HardwareCorrupted) against AnonHugePages and continues past each non-match]
00:04:05.798 12:03:52 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:05.798 12:03:52 -- setup/common.sh@33 -- # echo 0
00:04:05.798 12:03:52 -- setup/common.sh@33 -- # return 0
00:04:05.798 12:03:52 -- setup/hugepages.sh@97 -- # anon=0
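Every get_meminfo call in this trace is the same read loop: split a /proc/meminfo line on ': ', compare the key against the requested field, and print the value on a match (the trailing kB unit lands in the throwaway _ field). A minimal standalone sketch of that lookup, assuming the stock /proc/meminfo layout; get_field is an illustrative name, not the script's own helper:

    # Look up one /proc/meminfo field the way the trace above does.
    get_field() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            # IFS=': ' splits "Key:   value kB" into key, value, "kB".
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }

    get_field AnonHugePages   # prints 0 on the machine in this run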
00:04:05.798 12:03:52 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:05.798 12:03:52 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:05.798 12:03:52 -- setup/common.sh@18 -- # local node=
00:04:05.798 12:03:52 -- setup/common.sh@19 -- # local var val
00:04:05.798 12:03:52 -- setup/common.sh@20 -- # local mem_f mem
00:04:05.798 12:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:05.798 12:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:05.798 12:03:52 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:05.798 12:03:52 -- setup/common.sh@28 -- # mapfile -t mem
00:04:05.798 12:03:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:05.798 12:03:52 -- setup/common.sh@31 -- # IFS=': '
00:04:05.798 12:03:52 -- setup/common.sh@31 -- # read -r var val _
00:04:05.798 12:03:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41408212 kB' 'MemAvailable: 44279136 kB' 'Buffers: 15072 kB' 'Cached: 12390204 kB' 'SwapCached: 60 kB' 'Active: 7372236 kB' 'Inactive: 5519360 kB' 'Active(anon): 6480300 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489716 kB' 'Mapped: 185676 kB' 'Shmem: 9336980 kB' 'KReclaimable: 564964 kB' 'Slab: 1513060 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 948096 kB' 'KernelStack: 21760 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11053440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217940 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
00:04:05.798 12:03:52 [xtrace condensed: setup/common.sh@31-32 compares every meminfo key (MemTotal through HugePages_Rsvd) against HugePages_Surp and continues past each non-match]
00:04:05.799 12:03:52 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:05.799 12:03:52 -- setup/common.sh@33 -- # echo 0
00:04:05.799 12:03:52 -- setup/common.sh@33 -- # return 0
00:04:05.799 12:03:52 -- setup/hugepages.sh@99 -- # surp=0
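The mem=("${mem[@]#Node +([0-9]) }") step that precedes each snapshot above strips the "Node <id> " prefix carried by per-node meminfo files, so the same parser serves both /proc/meminfo and /sys/devices/system/node/nodeN/meminfo. A sketch of that normalization with sample lines inlined in place of a real node file; the +([0-9]) pattern needs extglob:

    shopt -s extglob   # required for the +([0-9]) pattern below

    # Two lines shaped like a per-node meminfo file.
    mem=('Node 0 HugePages_Total:   512' 'Node 0 HugePages_Free:    512')

    # Strip the "Node <id> " prefix so the lines parse like /proc/meminfo.
    mem=("${mem[@]#Node +([0-9]) }")
    printf '%s\n' "${mem[@]}"   # HugePages_Total: 512 / HugePages_Free: 512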
00:04:05.799 12:03:52 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:05.799 12:03:52 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:05.799 12:03:52 -- setup/common.sh@18 -- # local node=
00:04:05.799 12:03:52 -- setup/common.sh@19 -- # local var val
00:04:05.799 12:03:52 -- setup/common.sh@20 -- # local mem_f mem
00:04:05.799 12:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:05.799 12:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:05.799 12:03:52 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:05.799 12:03:52 -- setup/common.sh@28 -- # mapfile -t mem
00:04:05.799 12:03:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:05.799 12:03:52 -- setup/common.sh@31 -- # IFS=': '
00:04:05.799 12:03:52 -- setup/common.sh@31 -- # read -r var val _
00:04:05.799 12:03:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41408212 kB' 'MemAvailable: 44279136 kB' 'Buffers: 15072 kB' 'Cached: 12390216 kB' 'SwapCached: 60 kB' 'Active: 7372260 kB' 'Inactive: 5519360 kB' 'Active(anon): 6480324 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489712 kB' 'Mapped: 185676 kB' 'Shmem: 9336992 kB' 'KReclaimable: 564964 kB' 'Slab: 1513060 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 948096 kB' 'KernelStack: 21760 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11053456 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
00:04:05.799 12:03:52 [xtrace condensed: setup/common.sh@31-32 compares every meminfo key (MemTotal through HugePages_Free) against HugePages_Rsvd and continues past each non-match]
00:04:05.800 12:03:52 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:05.800 12:03:52 -- setup/common.sh@33 -- # echo 0
00:04:05.800 12:03:52 -- setup/common.sh@33 -- # return 0
00:04:05.800 12:03:52 -- setup/hugepages.sh@100 -- # resv=0
00:04:05.800 12:03:52 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:05.800 nr_hugepages=1024
00:04:05.800 12:03:52 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:05.800 resv_hugepages=0
00:04:05.800 12:03:52 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:05.800 surplus_hugepages=0
00:04:05.800 12:03:52 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:05.800 anon_hugepages=0
00:04:05.800 12:03:52 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:05.800 12:03:52 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
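The two arithmetic checks above are the core of verify_nr_hugepages: the kernel-reported total must be explained entirely by the pages the test requested plus any surplus and reserved pages, which with the values from this run is 1024 == 1024 + 0 + 0. A minimal sketch of the same bookkeeping, with the numbers hard-coded from the snapshots above:

    # Hugepage pool bookkeeping as verify_nr_hugepages checks it.
    nr_hugepages=1024   # pages the test asked for (NRHUGE)
    surp=0              # HugePages_Surp from the snapshot above
    resv=0              # HugePages_Rsvd from the snapshot above
    total=1024          # HugePages_Total from the snapshot above

    # Consistent only if the kernel total is fully accounted for by
    # the request plus surplus and reserved pages.
    if (( total == nr_hugepages + surp + resv )); then
        echo 'hugepage accounting is consistent'
    fi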
00:04:05.800 12:03:52 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:05.800 12:03:52 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:05.800 12:03:52 -- setup/common.sh@18 -- # local node=
00:04:05.800 12:03:52 -- setup/common.sh@19 -- # local var val
00:04:05.800 12:03:52 -- setup/common.sh@20 -- # local mem_f mem
00:04:05.801 12:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:05.801 12:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:05.801 12:03:52 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:05.801 12:03:52 -- setup/common.sh@28 -- # mapfile -t mem
00:04:05.801 12:03:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:05.801 12:03:52 -- setup/common.sh@31 -- # IFS=': '
00:04:05.801 12:03:52 -- setup/common.sh@31 -- # read -r var val _
00:04:05.801 12:03:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41408716 kB' 'MemAvailable: 44279640 kB' 'Buffers: 15072 kB' 'Cached: 12390240 kB' 'SwapCached: 60 kB' 'Active: 7372264 kB' 'Inactive: 5519360 kB' 'Active(anon): 6480328 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489668 kB' 'Mapped: 185676 kB' 'Shmem: 9337016 kB' 'KReclaimable: 564964 kB' 'Slab: 1513060 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 948096 kB' 'KernelStack: 21760 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11053468 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
00:04:05.801 12:03:52 [xtrace condensed: setup/common.sh@31-32 scan for HugePages_Total in progress (MemTotal through Committed_AS shown); capture truncated here]
setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 
00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.802 12:03:52 -- setup/common.sh@33 -- # echo 1024 00:04:05.802 12:03:52 -- setup/common.sh@33 -- # return 0 00:04:05.802 12:03:52 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:05.802 12:03:52 -- setup/hugepages.sh@112 -- # get_nodes 00:04:05.802 12:03:52 -- setup/hugepages.sh@27 -- # local node 00:04:05.802 12:03:52 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:05.802 12:03:52 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:05.802 12:03:52 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:05.802 12:03:52 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:05.802 12:03:52 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:05.802 12:03:52 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:05.802 12:03:52 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:05.802 12:03:52 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:05.802 12:03:52 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:05.802 12:03:52 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:05.802 12:03:52 -- setup/common.sh@18 -- # local node=0 00:04:05.802 12:03:52 -- setup/common.sh@19 -- # local var val 00:04:05.802 12:03:52 -- setup/common.sh@20 -- # local mem_f mem 00:04:05.802 12:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.802 12:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:05.802 12:03:52 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:05.802 12:03:52 -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.802 12:03:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.802 12:03:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 25073968 kB' 'MemUsed: 7560468 kB' 'SwapCached: 56 kB' 'Active: 4872740 kB' 'Inactive: 391308 kB' 'Active(anon): 4078412 kB' 'Inactive(anon): 120 kB' 'Active(file): 794328 kB' 'Inactive(file): 391188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4872000 kB' 'Mapped: 66792 kB' 'AnonPages: 395316 kB' 'Shmem: 3686428 kB' 'KernelStack: 11752 kB' 'PageTables: 5100 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 199936 kB' 'Slab: 640232 kB' 'SReclaimable: 199936 kB' 'SUnreclaim: 440296 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 
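(The wall of `[[ <Field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] ... continue` entries above is bash xtrace of a simple field scan: `get_meminfo` captures the meminfo text into an array, re-reads it with `IFS=': '`, and echoes the value once the requested key matches, which is the `echo 1024` / `return 0` step the caller then feeds into `(( 1024 == nr_hugepages + surp + resv ))`. A minimal sketch of that pattern, reconstructed from the trace rather than copied from SPDK's `setup/common.sh`:

```bash
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern the xtrace above is exercising.
# Reconstructed from the trace, not copied from SPDK's setup/common.sh.
get_meminfo() {
    local get=$1 node=${2:-}      # e.g. get=HugePages_Total, node=0
    local mem_f=/proc/meminfo
    local -a mem
    local var val _

    # Per-node lookups switch to the sysfs copy, as in the trace.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    shopt -s extglob
    mapfile -t mem < "$mem_f"
    # Per-node meminfo lines carry a "Node <N> " prefix; strip it the
    # same way the trace does before scanning.
    mem=("${mem[@]#Node +([0-9]) }")

    # The long [[ <Field> == <key> ]] / continue run in the log is this
    # loop, one iteration per meminfo field.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"               # the trace's "echo 1024" step
        return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

# Caller-side check seen right after the scan:
# (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv ))
```

The per-node branch matches the node 0 / node 1 lookups that follow: the same scan runs against `/sys/devices/system/node/nodeN/meminfo` once the `Node N ` prefix is stripped.)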
00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.802 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.802 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@33 -- # echo 0 00:04:05.803 12:03:52 -- setup/common.sh@33 -- # return 0 00:04:05.803 12:03:52 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:05.803 12:03:52 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:05.803 12:03:52 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:05.803 12:03:52 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:05.803 12:03:52 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:05.803 12:03:52 -- setup/common.sh@18 -- # local node=1 00:04:05.803 12:03:52 -- setup/common.sh@19 -- # local var val 00:04:05.803 12:03:52 -- setup/common.sh@20 -- # local mem_f mem 00:04:05.803 12:03:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.803 12:03:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:05.803 12:03:52 -- setup/common.sh@24 -- # 
mem_f=/sys/devices/system/node/node1/meminfo 00:04:05.803 12:03:52 -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.803 12:03:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649356 kB' 'MemFree: 16334684 kB' 'MemUsed: 11314672 kB' 'SwapCached: 4 kB' 'Active: 2499200 kB' 'Inactive: 5128052 kB' 'Active(anon): 2401592 kB' 'Inactive(anon): 3342880 kB' 'Active(file): 97608 kB' 'Inactive(file): 1785172 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7533400 kB' 'Mapped: 118884 kB' 'AnonPages: 94008 kB' 'Shmem: 5650616 kB' 'KernelStack: 9992 kB' 'PageTables: 2824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 365028 kB' 'Slab: 872828 kB' 'SReclaimable: 365028 kB' 'SUnreclaim: 507800 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.803 12:03:52 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.803 12:03:52 -- 
setup/common.sh@32 -- # continue 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.803 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # continue 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.804 12:03:52 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.804 12:03:52 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.804 12:03:52 -- setup/common.sh@33 -- # echo 0 00:04:05.804 12:03:52 -- setup/common.sh@33 -- # return 0 00:04:05.804 12:03:52 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:05.804 12:03:52 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:05.804 12:03:52 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:05.804 12:03:52 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:05.804 12:03:52 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:05.804 node0=512 expecting 512 00:04:05.804 12:03:52 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:05.804 12:03:52 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:05.804 12:03:52 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:05.804 12:03:52 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:05.804 node1=512 expecting 512 00:04:05.804 12:03:52 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:05.804 00:04:05.804 real 0m3.410s 00:04:05.804 user 0m1.150s 00:04:05.804 sys 0m2.211s 00:04:05.804 12:03:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:05.804 12:03:52 -- common/autotest_common.sh@10 -- # set +x 00:04:05.804 ************************************ 00:04:05.804 END TEST even_2G_alloc 00:04:05.804 ************************************ 00:04:05.804 12:03:52 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:05.804 12:03:52 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:05.804 12:03:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:05.804 12:03:52 -- common/autotest_common.sh@10 -- # set +x 00:04:05.804 ************************************ 00:04:05.804 START TEST odd_alloc 00:04:05.804 ************************************ 00:04:05.804 12:03:52 -- common/autotest_common.sh@1104 -- # odd_alloc 00:04:05.804 12:03:52 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:05.804 12:03:52 -- setup/hugepages.sh@49 -- # local size=2098176 00:04:05.804 12:03:52 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:05.804 12:03:52 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:05.804 12:03:52 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:05.804 12:03:52 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:05.804 12:03:52 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:05.804 12:03:52 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:05.804 12:03:52 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:05.804 12:03:52 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:05.804 12:03:52 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:05.804 12:03:52 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:05.804 12:03:52 
-- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:05.804 12:03:52 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:05.804 12:03:52 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:05.804 12:03:52 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:05.804 12:03:52 -- setup/hugepages.sh@83 -- # : 513 00:04:05.804 12:03:52 -- setup/hugepages.sh@84 -- # : 1 00:04:05.804 12:03:52 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:05.804 12:03:52 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:05.804 12:03:52 -- setup/hugepages.sh@83 -- # : 0 00:04:05.804 12:03:52 -- setup/hugepages.sh@84 -- # : 0 00:04:05.804 12:03:52 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:05.804 12:03:52 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:05.804 12:03:52 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:05.804 12:03:52 -- setup/hugepages.sh@160 -- # setup output 00:04:05.804 12:03:52 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:05.804 12:03:52 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:09.095 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:09.095 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:09.357 12:03:56 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:09.357 12:03:56 -- setup/hugepages.sh@89 -- # local node 00:04:09.357 12:03:56 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:09.357 12:03:56 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:09.357 12:03:56 -- setup/hugepages.sh@92 -- # local surp 00:04:09.357 12:03:56 -- setup/hugepages.sh@93 -- # local resv 00:04:09.357 12:03:56 -- setup/hugepages.sh@94 -- # local anon 00:04:09.357 12:03:56 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:09.357 12:03:56 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:09.357 12:03:56 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:09.357 12:03:56 -- setup/common.sh@18 -- # local node= 00:04:09.357 12:03:56 -- setup/common.sh@19 -- # local var val 00:04:09.357 12:03:56 -- setup/common.sh@20 -- # local mem_f mem 00:04:09.357 12:03:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.357 12:03:56 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.357 12:03:56 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.357 12:03:56 -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.357 
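(`even_2G_alloc` passes and the harness moves on to `odd_alloc`, which requests 1025 pages (`HUGEMEM=2049`), so the two NUMA nodes cannot split them evenly: the `hugepages.sh@81-84` entries assign 512 to the last node and 513 to the first. A sketch of that distribution, with the loop body inferred from the trace's `: 513` / `: 1` no-op entries rather than taken from the source:

```bash
# Sketch of odd_alloc's per-node split: 1025 pages over 2 nodes, filled
# from the last node down so the odd page lands on node 0. Loop body
# inferred from the hugepages.sh@81-84 xtrace entries, not the source.
_nr_hugepages=1025
_no_nodes=2
nodes_test=()
while (( _no_nodes > 0 )); do
    nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))  # 512, then 513
    _nr_hugepages=$(( _nr_hugepages - nodes_test[_no_nodes - 1] ))
    (( _no_nodes-- ))
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"  # node0=513 node1=512
```

Filling from the last node down keeps the remainder on node 0, which is why the later per-node verification should expect `node0=513` and `node1=512` for this test.)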
12:03:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.357 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.357 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.357 12:03:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41433128 kB' 'MemAvailable: 44304052 kB' 'Buffers: 15072 kB' 'Cached: 12390328 kB' 'SwapCached: 60 kB' 'Active: 7373580 kB' 'Inactive: 5519360 kB' 'Active(anon): 6481644 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490372 kB' 'Mapped: 185692 kB' 'Shmem: 9337104 kB' 'KReclaimable: 564964 kB' 'Slab: 1512304 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 947340 kB' 'KernelStack: 21744 kB' 'PageTables: 7928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11054076 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB' 00:04:09.357 12:03:56 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.357 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.357 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.357 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.357 12:03:56 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.357 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.357 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.357 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Inactive == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 
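(Before `verify_nr_hugepages` reads `AnonHugePages` (the scan in progress here), it checks the transparent-hugepage mode string `always [madvise] never` against `*\[\n\e\v\e\r\]*` (the `hugepages.sh@96` entry). That string presumably comes from `/sys/kernel/mm/transparent_hugepage/enabled`; the trace only shows the pattern test, so treat the path in this sketch as an assumption:

```bash
# Guard sketch: only read AnonHugePages when transparent hugepages are
# not pinned to "[never]". The sysfs path below is an assumption; the
# trace only shows the pattern test on "always [madvise] never".
thp=/sys/kernel/mm/transparent_hugepage/enabled
anon=0
if [[ -r $thp && $(<"$thp") != *"[never]"* ]]; then
    anon=$(get_meminfo AnonHugePages)   # helper sketched earlier
fi
echo "anon=$anon"   # this run lands on anon=0
```
)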
00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 
00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.358 12:03:56 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.358 12:03:56 -- setup/common.sh@33 -- # echo 0 00:04:09.358 12:03:56 -- setup/common.sh@33 -- # return 0 00:04:09.358 12:03:56 -- setup/hugepages.sh@97 -- # anon=0 00:04:09.358 12:03:56 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:09.358 12:03:56 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.358 12:03:56 -- setup/common.sh@18 -- # local node= 00:04:09.358 12:03:56 -- setup/common.sh@19 -- # local var val 00:04:09.358 12:03:56 -- setup/common.sh@20 -- # local mem_f mem 00:04:09.358 12:03:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.358 12:03:56 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.358 12:03:56 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.358 12:03:56 -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.358 12:03:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.358 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41437880 kB' 'MemAvailable: 44308804 kB' 'Buffers: 15072 kB' 'Cached: 12390332 kB' 'SwapCached: 60 kB' 'Active: 7373072 kB' 'Inactive: 5519360 kB' 'Active(anon): 6481136 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 
8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490252 kB' 'Mapped: 185684 kB' 'Shmem: 9337108 kB' 'KReclaimable: 564964 kB' 'Slab: 1512304 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 947340 kB' 'KernelStack: 21760 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11054088 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB' 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
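(The `HugePages_Surp` scans close out the per-node accounting: reserved and surplus pages are folded into each node's measured count before it is echoed against the expectation, as in the earlier `node0=512 expecting 512` / `node1=512 expecting 512` lines. A sketch of that bookkeeping, with names mirroring the `hugepages.sh@115-128` entries but not taken verbatim from the source:

```bash
# Per-node bookkeeping sketch: fold reserved and surplus pages into the
# measured count for each node, then report it against the expectation.
# The trace does this in two loops (hugepages.sh@115-117 and @126-128);
# they are merged here for brevity.
nodes_test=(513 512)                     # measured per node, this run
nodes_sys=(513 512)                      # expected per node
resv=$(get_meminfo HugePages_Rsvd)       # 0 in the trace
for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv ))
    surp=$(get_meminfo HugePages_Surp "$node")   # per-node sysfs lookup
    (( nodes_test[node] += surp ))
    echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
done
```
)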
00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 
12:03:56 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.359 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.359 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Unaccepted 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.360 12:03:56 -- setup/common.sh@33 -- # echo 0 00:04:09.360 12:03:56 -- setup/common.sh@33 -- # return 0 00:04:09.360 12:03:56 -- setup/hugepages.sh@99 -- # surp=0 00:04:09.360 12:03:56 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:09.360 12:03:56 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:09.360 12:03:56 -- setup/common.sh@18 -- # local node= 00:04:09.360 12:03:56 -- setup/common.sh@19 -- # local var val 00:04:09.360 12:03:56 -- setup/common.sh@20 -- # local mem_f mem 00:04:09.360 12:03:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.360 12:03:56 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.360 12:03:56 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.360 12:03:56 -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.360 12:03:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41438324 kB' 'MemAvailable: 44309248 kB' 'Buffers: 15072 kB' 'Cached: 12390344 kB' 'SwapCached: 60 kB' 'Active: 7373104 kB' 'Inactive: 5519360 kB' 'Active(anon): 6481168 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490252 kB' 'Mapped: 185684 kB' 'Shmem: 9337120 kB' 'KReclaimable: 564964 kB' 'Slab: 1512304 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 947340 kB' 'KernelStack: 21760 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11054104 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB' 00:04:09.360 12:03:56 -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.360 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.360 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- 
setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 
12:03:56 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.361 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.361 12:03:56 -- setup/common.sh@33 -- # echo 0 00:04:09.361 
12:03:56 -- setup/common.sh@33 -- # return 0 00:04:09.361 12:03:56 -- setup/hugepages.sh@100 -- # resv=0 00:04:09.361 12:03:56 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:09.361 nr_hugepages=1025 00:04:09.361 12:03:56 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:09.361 resv_hugepages=0 00:04:09.361 12:03:56 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:09.361 surplus_hugepages=0 00:04:09.361 12:03:56 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:09.361 anon_hugepages=0 00:04:09.361 12:03:56 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:09.361 12:03:56 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:09.361 12:03:56 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:09.361 12:03:56 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:09.361 12:03:56 -- setup/common.sh@18 -- # local node= 00:04:09.361 12:03:56 -- setup/common.sh@19 -- # local var val 00:04:09.361 12:03:56 -- setup/common.sh@20 -- # local mem_f mem 00:04:09.361 12:03:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.361 12:03:56 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.361 12:03:56 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.361 12:03:56 -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.361 12:03:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.361 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41439244 kB' 'MemAvailable: 44310168 kB' 'Buffers: 15072 kB' 'Cached: 12390356 kB' 'SwapCached: 60 kB' 'Active: 7373108 kB' 'Inactive: 5519360 kB' 'Active(anon): 6481172 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490252 kB' 'Mapped: 185684 kB' 'Shmem: 9337132 kB' 'KReclaimable: 564964 kB' 'Slab: 1512304 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 947340 kB' 'KernelStack: 21760 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11054120 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB' 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
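[editor note] At this point the three scans have resolved to anon=0, surp=0 and resv=0, and hugepages.sh@102..@105 echo the summary (nr_hugepages=1025, resv_hugepages=0, surplus_hugepages=0, anon_hugepages=0). The @107/@110 arithmetic tests then assert the book-keeping: the kernel's HugePages_Total must equal nr_hugepages + surp + resv, i.e. 1025 + 0 + 0 = 1025. A restatement of that check, using the get_meminfo sketch above and the values echoed in this run:

    nr_hugepages=1025 surp=0 resv=0 anon=0
    total=$(get_meminfo HugePages_Total)          # 1025 in this run
    if (( total == nr_hugepages + surp + resv )); then
        echo "hugepage accounting OK: $total == $nr_hugepages + $surp + $resv"
    fi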
00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- 
# read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 
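[editor note] The long \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l runs throughout this trace are not corruption: under set -x, bash backslash-escapes the expanded, unquoted right-hand side of a [[ == ]] comparison so the trace replays it as a literal pattern. A hypothetical two-line repro of the same escaping:

    set -x
    get=HugePages_Total
    if [[ MemTotal == $get ]]; then :; fi   # traced as: [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
    set +x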
00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.362 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.362 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 
12:03:56 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.363 12:03:56 -- setup/common.sh@33 -- # echo 1025 00:04:09.363 12:03:56 -- setup/common.sh@33 -- # return 0 00:04:09.363 12:03:56 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:09.363 12:03:56 -- setup/hugepages.sh@112 -- # get_nodes 00:04:09.363 12:03:56 -- setup/hugepages.sh@27 -- # local node 00:04:09.363 12:03:56 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:09.363 12:03:56 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:09.363 12:03:56 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:09.363 12:03:56 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:09.363 12:03:56 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:09.363 12:03:56 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:09.363 12:03:56 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:09.363 12:03:56 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:09.363 12:03:56 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:09.363 12:03:56 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.363 12:03:56 
-- setup/common.sh@18 -- # local node=0 00:04:09.363 12:03:56 -- setup/common.sh@19 -- # local var val 00:04:09.363 12:03:56 -- setup/common.sh@20 -- # local mem_f mem 00:04:09.363 12:03:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.363 12:03:56 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:09.363 12:03:56 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:09.363 12:03:56 -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.363 12:03:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.363 12:03:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 25091204 kB' 'MemUsed: 7543232 kB' 'SwapCached: 56 kB' 'Active: 4873792 kB' 'Inactive: 391308 kB' 'Active(anon): 4079464 kB' 'Inactive(anon): 120 kB' 'Active(file): 794328 kB' 'Inactive(file): 391188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4872076 kB' 'Mapped: 66800 kB' 'AnonPages: 396252 kB' 'Shmem: 3686504 kB' 'KernelStack: 11768 kB' 'PageTables: 5152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 199936 kB' 'Slab: 639616 kB' 'SReclaimable: 199936 kB' 'SUnreclaim: 439680 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- 
setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.363 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.363 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 
00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
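[editor note] The scan running here is the per-node variant: with node=0, common.sh@23/@24 switch mem_f to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix that the @29 expansion strips before parsing. The node-0 printf above reports HugePages_Total: 512 / HugePages_Free: 512 / HugePages_Surp: 0. A minimal standalone read of the same file (assuming a two-node box like this one):

    shopt -s extglob
    mapfile -t mem < /sys/devices/system/node/node0/meminfo
    mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 MemTotal: ..." -> "MemTotal: ..."
    printf '%s\n' "${mem[@]}" | grep '^HugePages_'
    # Expected in this run: Total 512, Free 512, Surp 0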
00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@33 -- # echo 0 00:04:09.364 12:03:56 -- setup/common.sh@33 -- # return 0 00:04:09.364 12:03:56 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:09.364 12:03:56 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:09.364 12:03:56 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:09.364 12:03:56 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:09.364 12:03:56 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.364 12:03:56 -- setup/common.sh@18 -- # local node=1 00:04:09.364 12:03:56 -- setup/common.sh@19 -- # local var val 00:04:09.364 12:03:56 -- setup/common.sh@20 -- # local mem_f mem 00:04:09.364 12:03:56 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.364 12:03:56 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:09.364 12:03:56 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:09.364 12:03:56 -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.364 12:03:56 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649356 kB' 'MemFree: 16347788 kB' 'MemUsed: 11301568 kB' 'SwapCached: 4 kB' 'Active: 2499292 kB' 'Inactive: 5128052 kB' 'Active(anon): 2401684 kB' 'Inactive(anon): 3342880 kB' 'Active(file): 97608 kB' 'Inactive(file): 1785172 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7533428 kB' 'Mapped: 118884 kB' 'AnonPages: 94000 kB' 'Shmem: 5650644 kB' 'KernelStack: 9992 kB' 'PageTables: 2824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 365028 kB' 'Slab: 872688 kB' 'SReclaimable: 365028 kB' 'SUnreclaim: 507660 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.364 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.364 12:03:56 
-- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
[xtrace elided: setup/common.sh@32 scans the node1 meminfo keys, MemFree through Slab, none matching HugePages_Surp]
00:04:09.364 12:03:56 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.364 12:03:56 --
setup/common.sh@32 -- # continue 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # continue 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.365 12:03:56 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.365 12:03:56 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.365 12:03:56 -- setup/common.sh@33 -- # echo 0 00:04:09.365 12:03:56 -- setup/common.sh@33 -- # return 0 00:04:09.365 12:03:56 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:09.365 12:03:56 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:09.365 12:03:56 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:09.365 12:03:56 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:09.365 12:03:56 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:09.365 node0=512 expecting 513 00:04:09.365 12:03:56 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:09.365 12:03:56 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 
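The per-node reads above all boil down to the same lookup that the xtrace keeps stepping through: open /proc/meminfo (or the node's meminfo file under /sys, whose lines carry a "Node N " prefix), then walk the fields until the requested key matches. A condensed sketch of that lookup, with names mirroring the trace (get_meminfo, mem_f, the prefix strip at common.sh@29); the upstream setup/common.sh may differ in details, so treat this as illustrative only:

shopt -s extglob   # needed for the "Node N " prefix strip below

get_meminfo() {
    local get=$1 node=$2
    local var val _ line
    local mem_f=/proc/meminfo
    # per-node counters live under /sys and each line starts with "Node N "
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local -a mem
    mapfile -t mem <"$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # drop the prefix so keys line up
    for line in "${mem[@]}"; do
        # the comparison the trace loops over: continue until the key matches
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    echo 0
}

# e.g. get_meminfo HugePages_Surp 1   -> 0, as echoed in the trace above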
00:04:09.365 12:03:56 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:09.365 12:03:56 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:09.365 node1=513 expecting 512 00:04:09.365 12:03:56 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:09.365 00:04:09.365 real 0m3.572s 00:04:09.365 user 0m1.219s 00:04:09.365 sys 0m2.375s 00:04:09.365 12:03:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:09.365 12:03:56 -- common/autotest_common.sh@10 -- # set +x 00:04:09.365 ************************************ 00:04:09.365 END TEST odd_alloc 00:04:09.365 ************************************ 00:04:09.624 12:03:56 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:09.624 12:03:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:09.624 12:03:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:09.624 12:03:56 -- common/autotest_common.sh@10 -- # set +x 00:04:09.624 ************************************ 00:04:09.624 START TEST custom_alloc 00:04:09.624 ************************************ 00:04:09.624 12:03:56 -- common/autotest_common.sh@1104 -- # custom_alloc 00:04:09.624 12:03:56 -- setup/hugepages.sh@167 -- # local IFS=, 00:04:09.624 12:03:56 -- setup/hugepages.sh@169 -- # local node 00:04:09.624 12:03:56 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:09.624 12:03:56 -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:09.624 12:03:56 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:09.624 12:03:56 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:09.624 12:03:56 -- setup/hugepages.sh@49 -- # local size=1048576 00:04:09.624 12:03:56 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:09.624 12:03:56 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:09.624 12:03:56 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:09.624 12:03:56 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:09.624 12:03:56 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:09.624 12:03:56 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:09.624 12:03:56 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:09.624 12:03:56 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:09.624 12:03:56 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:09.624 12:03:56 -- setup/hugepages.sh@83 -- # : 256 00:04:09.624 12:03:56 -- setup/hugepages.sh@84 -- # : 1 00:04:09.624 12:03:56 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:09.624 12:03:56 -- setup/hugepages.sh@83 -- # : 0 00:04:09.624 12:03:56 -- setup/hugepages.sh@84 -- # : 0 00:04:09.624 12:03:56 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:09.624 12:03:56 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:09.624 12:03:56 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:09.624 12:03:56 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@55 -- # (( size >= 
default_hugepages )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:09.624 12:03:56 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:09.624 12:03:56 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:09.624 12:03:56 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:09.624 12:03:56 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:09.624 12:03:56 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:09.624 12:03:56 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:09.624 12:03:56 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:09.624 12:03:56 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:09.624 12:03:56 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:09.624 12:03:56 -- setup/hugepages.sh@78 -- # return 0 00:04:09.624 12:03:56 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:09.624 12:03:56 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:09.624 12:03:56 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:09.624 12:03:56 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:09.624 12:03:56 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:09.624 12:03:56 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:09.624 12:03:56 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:09.624 12:03:56 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:09.624 12:03:56 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:09.624 12:03:56 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:09.624 12:03:56 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:09.624 12:03:56 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:09.624 12:03:56 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:09.624 12:03:56 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:09.624 12:03:56 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:09.624 12:03:56 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:09.624 12:03:56 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:09.624 12:03:56 -- setup/hugepages.sh@78 -- # return 0 00:04:09.624 12:03:56 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:09.624 12:03:56 -- setup/hugepages.sh@187 -- # setup output 00:04:09.624 12:03:56 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:09.624 12:03:56 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:12.912 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:80:04.7 
(8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:12.912 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:12.912 12:03:59 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:12.912 12:03:59 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:12.912 12:03:59 -- setup/hugepages.sh@89 -- # local node 00:04:12.912 12:03:59 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:12.912 12:03:59 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:12.912 12:03:59 -- setup/hugepages.sh@92 -- # local surp 00:04:12.912 12:03:59 -- setup/hugepages.sh@93 -- # local resv 00:04:12.912 12:03:59 -- setup/hugepages.sh@94 -- # local anon 00:04:12.912 12:03:59 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:12.912 12:03:59 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:12.912 12:03:59 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:12.912 12:03:59 -- setup/common.sh@18 -- # local node= 00:04:12.912 12:03:59 -- setup/common.sh@19 -- # local var val 00:04:12.912 12:03:59 -- setup/common.sh@20 -- # local mem_f mem 00:04:12.912 12:03:59 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:12.912 12:03:59 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:12.912 12:03:59 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:12.912 12:03:59 -- setup/common.sh@28 -- # mapfile -t mem 00:04:12.912 12:03:59 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:12.912 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:12.912 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:12.912 12:03:59 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 40384172 kB' 'MemAvailable: 43255096 kB' 'Buffers: 15072 kB' 'Cached: 12390468 kB' 'SwapCached: 60 kB' 'Active: 7374736 kB' 'Inactive: 5519360 kB' 'Active(anon): 6482800 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491696 kB' 'Mapped: 185732 kB' 'Shmem: 9337244 kB' 'KReclaimable: 564964 kB' 'Slab: 1512668 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 947704 kB' 'KernelStack: 21984 kB' 'PageTables: 8160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11059376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218292 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB' 00:04:12.913 12:03:59 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:12.913 
12:03:59 -- setup/common.sh@32 -- # continue
[xtrace elided: setup/common.sh@32 scans /proc/meminfo keys MemFree through Percpu, none matching AnonHugePages]
00:04:13.176 12:03:59 --
setup/common.sh@31 -- # IFS=': ' 00:04:13.176 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.176 12:03:59 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.176 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.176 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.176 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.176 12:03:59 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.176 12:03:59 -- setup/common.sh@33 -- # echo 0 00:04:13.176 12:03:59 -- setup/common.sh@33 -- # return 0 00:04:13.176 12:03:59 -- setup/hugepages.sh@97 -- # anon=0 00:04:13.176 12:03:59 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:13.176 12:03:59 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.176 12:03:59 -- setup/common.sh@18 -- # local node= 00:04:13.176 12:03:59 -- setup/common.sh@19 -- # local var val 00:04:13.176 12:03:59 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.176 12:03:59 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.176 12:03:59 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.176 12:03:59 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.176 12:03:59 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.176 12:03:59 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.176 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.176 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.176 12:03:59 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 40383388 kB' 'MemAvailable: 43254312 kB' 'Buffers: 15072 kB' 'Cached: 12390472 kB' 'SwapCached: 60 kB' 'Active: 7374904 kB' 'Inactive: 5519360 kB' 'Active(anon): 6482968 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491868 kB' 'Mapped: 185688 kB' 'Shmem: 9337248 kB' 'KReclaimable: 564964 kB' 'Slab: 1512664 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 947700 kB' 'KernelStack: 21952 kB' 'PageTables: 8348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11059388 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB' 00:04:13.176 12:03:59 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.176 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.176 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.176 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.176 12:03:59 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.176 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.176 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.176 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.176 12:03:59 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.176 12:03:59 -- setup/common.sh@32 -- # continue 
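Stepping back, the custom_alloc trace earlier reduces to simple arithmetic: each requested size in kB is divided by the 2048 kB default huge page to get a per-node page count, and the HUGENODE parameter handed to setup.sh is assembled from those counts. A minimal restatement of that math, with variable names taken from the trace (nodes_hp, HUGENODE, _nr_hugepages); the comma-join detail is an assumption, since the trace only shows the final string:

default_hugepages=2048   # kB, matches 'Hugepagesize: 2048 kB' in the dumps
declare -a nodes_hp
nodes_hp[0]=$(( 1048576 / default_hugepages ))   # 512 pages for node 0
nodes_hp[1]=$(( 2097152 / default_hugepages ))   # 1024 pages for node 1

HUGENODE=
_nr_hugepages=0
for node in "${!nodes_hp[@]}"; do
    HUGENODE+=${HUGENODE:+,}"nodes_hp[$node]=${nodes_hp[node]}"
    (( _nr_hugepages += nodes_hp[node] ))
done
echo "$HUGENODE"        # nodes_hp[0]=512,nodes_hp[1]=1024
echo "$_nr_hugepages"   # 1536, the HugePages_Total the dumps report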
[xtrace elided: setup/common.sh@32 scans /proc/meminfo keys Buffers through HardwareCorrupted, none matching HugePages_Surp]
00:04:13.177 12:03:59 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.177 12:03:59 -- setup/common.sh@32 -- #
continue 00:04:13.177 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.177 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.177 12:03:59 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.177 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.177 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.177 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.177 12:03:59 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.177 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.177 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.177 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.177 12:03:59 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.177 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.177 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.177 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.177 12:03:59 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.177 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.177 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.177 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.178 12:03:59 -- setup/common.sh@33 -- # echo 0 00:04:13.178 12:03:59 -- setup/common.sh@33 -- # return 0 00:04:13.178 12:03:59 -- setup/hugepages.sh@99 -- # surp=0 00:04:13.178 12:03:59 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:13.178 12:03:59 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:13.178 12:03:59 -- setup/common.sh@18 -- # local node= 00:04:13.178 12:03:59 -- setup/common.sh@19 -- # local var val 00:04:13.178 12:03:59 -- 
setup/common.sh@20 -- # local mem_f mem 00:04:13.178 12:03:59 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.178 12:03:59 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.178 12:03:59 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.178 12:03:59 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.178 12:03:59 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 40385968 kB' 'MemAvailable: 43256892 kB' 'Buffers: 15072 kB' 'Cached: 12390484 kB' 'SwapCached: 60 kB' 'Active: 7375064 kB' 'Inactive: 5519360 kB' 'Active(anon): 6483128 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492028 kB' 'Mapped: 185688 kB' 'Shmem: 9337260 kB' 'KReclaimable: 564964 kB' 'Slab: 1512708 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 947744 kB' 'KernelStack: 21968 kB' 'PageTables: 8276 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11059036 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218260 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB' 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.178 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.178 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 
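The scan below pulls HugePages_Rsvd the same way; with anon and surp already read back as 0, the verification that follows is pure bookkeeping: the pool's HugePages_Total must equal the requested count plus surplus and reserved pages, mirroring the hugepages.sh@107 check visible further on. A simplified restatement, substituting an awk one-liner for the script's own get_meminfo walk (an editorial shortcut, not the script's method):

nr_hugepages=1536 anon=0 surp=0 resv=0   # values echoed by this run
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
# verify_nr_hugepages passes when the pool size matches what was requested
if (( total == nr_hugepages + surp + resv )); then
    echo "nr_hugepages=$nr_hugepages"    # matches the echo further down
fi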
00:04:13.178 12:03:59 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
[xtrace elided: setup/common.sh@32 scans /proc/meminfo keys Inactive through FileHugePages, none matching HugePages_Rsvd]
00:04:13.179 12:03:59 -- setup/common.sh@31 -- # IFS=': '
00:04:13.179 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # continue 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.179 12:03:59 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.179 12:03:59 -- setup/common.sh@33 -- # echo 0 00:04:13.179 12:03:59 -- setup/common.sh@33 -- # return 0 00:04:13.179 12:03:59 -- setup/hugepages.sh@100 -- # resv=0 00:04:13.179 12:03:59 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:13.179 nr_hugepages=1536 00:04:13.179 12:03:59 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:13.179 resv_hugepages=0 00:04:13.179 12:03:59 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:13.179 surplus_hugepages=0 00:04:13.179 12:03:59 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:13.179 anon_hugepages=0 00:04:13.179 12:03:59 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:13.179 12:03:59 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:13.179 12:03:59 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:13.179 12:03:59 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:13.179 12:03:59 -- setup/common.sh@18 -- # local node= 00:04:13.179 12:03:59 -- setup/common.sh@19 -- # local var val 00:04:13.179 12:03:59 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.179 12:03:59 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.179 12:03:59 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.179 12:03:59 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.179 12:03:59 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.179 12:03:59 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.179 12:03:59 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.179 12:03:59 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 
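For anyone reading the trace: the long run of [[ key == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] / continue pairs above is setup/common.sh's get_meminfo helper scanning a captured meminfo dump one key at a time until the requested key matches, then echoing its value. A minimal sketch of that idiom, reconstructed from the xtrace rather than copied verbatim from the script:

    # get_meminfo KEY [NODE] -- print the value for KEY from /proc/meminfo,
    # or from the per-node copy when a NODE id is given (sketch).
    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        shopt -s extglob
        mem=("${mem[@]#Node +([0-9]) }")   # per-node lines carry a "Node N " prefix
        local var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

Called as get_meminfo HugePages_Rsvd it prints the 0 echoed above; with a node argument, as later in this run, it reads the per-node sysfs copy instead.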
00:04:13.179 12:03:59 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 40392356 kB' 'MemAvailable: 43263280 kB' 'Buffers: 15072 kB' 'Cached: 12390504 kB' 'SwapCached: 60 kB' 'Active: 7375800 kB' 'Inactive: 5519360 kB' 'Active(anon): 6483864 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492728 kB' 'Mapped: 185736 kB' 'Shmem: 9337280 kB' 'KReclaimable: 564964 kB' 'Slab: 1512612 kB' 'SReclaimable: 564964 kB' 'SUnreclaim: 947648 kB' 'KernelStack: 21968 kB' 'PageTables: 8548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11059060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218244 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
00:04:13.179-00:04:13.181 12:03:59-12:04:00 -- setup/common.sh@31-32 [xtrace condensed: the scan walks MemTotal through HugePages_Free; nothing matches HugePages_Total until the key itself comes up]
00:04:13.181 12:04:00 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:13.181 12:04:00 -- setup/common.sh@33 -- # echo 1536
00:04:13.181 12:04:00 -- setup/common.sh@33 -- # return 0
00:04:13.181 12:04:00 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:13.181 12:04:00 -- setup/hugepages.sh@112 -- # get_nodes
00:04:13.181 12:04:00 -- setup/hugepages.sh@27 -- # local node
00:04:13.181 12:04:00 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:13.181 12:04:00 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:13.181 12:04:00 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:13.181 12:04:00 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:13.181 12:04:00 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:13.181 12:04:00 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:13.181 12:04:00 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:13.181 12:04:00 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:13.181 12:04:00 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:13.181 12:04:00 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:13.181 12:04:00 -- setup/common.sh@18 -- # local node=0
00:04:13.181 12:04:00 -- setup/common.sh@19 -- # local var val
00:04:13.181 12:04:00 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.181 12:04:00 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.181 12:04:00 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:13.181 12:04:00 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:13.181 12:04:00 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.181 12:04:00 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.181 12:04:00 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 25089796 kB' 'MemUsed: 7544640 kB' 'SwapCached: 56 kB' 'Active: 4876836 kB' 'Inactive: 391308 kB' 'Active(anon): 4082508 kB' 'Inactive(anon): 120 kB' 'Active(file): 794328 kB' 'Inactive(file): 391188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4872168 kB' 'Mapped: 66804 kB' 'AnonPages: 399320 kB' 'Shmem: 3686596 kB' 'KernelStack: 11768 kB' 'PageTables: 5092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 199936 kB' 'Slab: 639816 kB' 'SReclaimable: 199936 kB' 'SUnreclaim: 439880 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
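The get_nodes step above discovers the NUMA nodes with an extglob pattern and records each node's configured 2048 kB hugepage count (512 on node0, 1024 on node1 in this run). A hedged reconstruction; the hugepages-2048kB sysfs leaf is an assumption, since the trace only shows the resulting assignments:

    shopt -s extglob nullglob
    nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
        # assumed source of the per-node count; the trace shows only the
        # results (node0 -> 512, node1 -> 1024)
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}     # 2 on this machine
    (( no_nodes > 0 ))            # the test bails out if no nodes are found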
00:04:13.181-00:04:13.182 12:04:00 -- setup/common.sh@31-32 [xtrace condensed: the node0 scan walks MemTotal through HugePages_Free; nothing matches HugePages_Surp until the key itself comes up]
00:04:13.182 12:04:00 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:13.182 12:04:00 -- setup/common.sh@33 -- # echo 0
00:04:13.182 12:04:00 -- setup/common.sh@33 -- # return 0
00:04:13.182 12:04:00 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:13.182 12:04:00 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:13.182 12:04:00 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:13.182 12:04:00 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
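The @115-@117 lines bracketing the two node scans are the expected-count bookkeeping: each node's test value is topped up with globally reserved pages and with that node's surplus pages, both 0 in this run. Roughly, reusing the get_meminfo sketch above:

    # top up each node's expected count before the final comparison
    resv=$(get_meminfo HugePages_Rsvd)          # 0 here
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        surp=$(get_meminfo HugePages_Surp "$node")
        (( nodes_test[node] += surp ))          # 0 on both nodes here
    done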
00:04:13.182 12:04:00 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:13.182 12:04:00 -- setup/common.sh@18 -- # local node=1
00:04:13.182 12:04:00 -- setup/common.sh@19 -- # local var val
00:04:13.182 12:04:00 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.182 12:04:00 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.182 12:04:00 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:13.182 12:04:00 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:13.182 12:04:00 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.182 12:04:00 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.182 12:04:00 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649356 kB' 'MemFree: 15308808 kB' 'MemUsed: 12340548 kB' 'SwapCached: 4 kB' 'Active: 2499740 kB' 'Inactive: 5128052 kB' 'Active(anon): 2402132 kB' 'Inactive(anon): 3342880 kB' 'Active(file): 97608 kB' 'Inactive(file): 1785172 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7533484 kB' 'Mapped: 119620 kB' 'AnonPages: 94396 kB' 'Shmem: 5650700 kB' 'KernelStack: 10024 kB' 'PageTables: 2800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 365028 kB' 'Slab: 872760 kB' 'SReclaimable: 365028 kB' 'SUnreclaim: 507732 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:13.182-00:04:13.183 12:04:00 -- setup/common.sh@31-32 [xtrace condensed: the node1 scan walks MemTotal through HugePages_Free; nothing matches HugePages_Surp until the key itself comes up]
00:04:13.183 12:04:00 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:13.183 12:04:00 -- setup/common.sh@33 -- # echo 0
00:04:13.183 12:04:00 -- setup/common.sh@33 -- # return 0
00:04:13.183 12:04:00 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:13.183 12:04:00 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:13.183 12:04:00 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:13.183 12:04:00 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:13.183 12:04:00 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:04:13.183 12:04:00 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:13.183 12:04:00 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:13.183 12:04:00 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:13.183 12:04:00 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
node1=1024 expecting 1024
00:04:13.183 12:04:00 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:13.183 real 0m3.737s
00:04:13.183 user 0m1.387s
00:04:13.183 sys 0m2.413s
00:04:13.183 12:04:00 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:13.183 12:04:00 -- common/autotest_common.sh@10 -- # set +x
00:04:13.183 ************************************
00:04:13.183 END TEST custom_alloc
00:04:13.183 ************************************
00:04:13.183 12:04:00 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:13.183 12:04:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:13.183 12:04:00 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:13.183 12:04:00 -- common/autotest_common.sh@10 -- # set +x
00:04:13.183 ************************************
00:04:13.183 START TEST no_shrink_alloc
00:04:13.183 ************************************
00:04:13.183 12:04:00 -- common/autotest_common.sh@1104 -- # no_shrink_alloc
00:04:13.183 12:04:00 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:13.183 12:04:00 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:13.183 12:04:00 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:13.183 12:04:00 -- setup/hugepages.sh@51 -- # shift
00:04:13.183 12:04:00 -- setup/hugepages.sh@52 -- # node_ids=('0')
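The sorted_t/sorted_s assignments and the final [[ 512,1024 == ... ]] test above are a set-style comparison: indexing arrays by the counts themselves deduplicates them, and the index lists match only when the observed per-node counts equal the expected ones. A reconstruction of the idea, with sample data matching this run (not the verbatim script):

    nodes_sys=([0]=512 [1]=1024)     # observed per-node hugepage counts
    nodes_test=([0]=512 [1]=1024)    # expected per-node hugepage counts
    sorted_t=() sorted_s=()
    for node in "${!nodes_test[@]}"; do
        sorted_t[nodes_test[node]]=1   # expected counts become array indices
        sorted_s[nodes_sys[node]]=1    # observed counts become array indices
        echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
    done
    # integer-indexed bash arrays list their indices in ascending order, so
    # equal sets of counts compare equal as comma-joined strings
    [[ $(IFS=,; echo "${!sorted_s[*]}") == "$(IFS=,; echo "${!sorted_t[*]}")" ]] \
        && echo PASS                   # here: 512,1024 == 512,1024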
00:04:13.183 12:04:00 -- setup/hugepages.sh@52 -- # local node_ids
00:04:13.183 12:04:00 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:13.183 12:04:00 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:13.183 12:04:00 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:13.183 12:04:00 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:13.183 12:04:00 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:13.183 12:04:00 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:13.183 12:04:00 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:13.183 12:04:00 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:13.183 12:04:00 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:13.183 12:04:00 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:13.183 12:04:00 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:13.183 12:04:00 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:13.183 12:04:00 -- setup/hugepages.sh@73 -- # return 0
00:04:13.183 12:04:00 -- setup/hugepages.sh@198 -- # setup output
00:04:13.183 12:04:00 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:13.183 12:04:00 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:17.378 [setup.sh output condensed: devices 0000:00:04.0-0000:00:04.7 and 0000:80:04.0-0000:80:04.7 (8086 2021) and 0000:d8:00.0 (8086 0a54) all report "Already using the vfio-pci driver"]
00:04:17.378 12:04:03 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:17.378 12:04:03 -- setup/hugepages.sh@89 -- # local node
00:04:17.378 12:04:03 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:17.378 12:04:03 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:17.378 12:04:03 -- setup/hugepages.sh@92 -- # local surp
00:04:17.378 12:04:03 -- setup/hugepages.sh@93 -- # local resv
00:04:17.378 12:04:03 -- setup/hugepages.sh@94 -- # local anon
00:04:17.378 12:04:03 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:17.378 12:04:03 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:17.378 12:04:03 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:17.378 12:04:03 -- setup/common.sh@18 -- # local node=
00:04:17.378 12:04:03 -- setup/common.sh@19 -- # local var val
00:04:17.378 12:04:03 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.378 12:04:03 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.378 12:04:03 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
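no_shrink_alloc's opening call, get_test_nr_hugepages 2097152 0, asks for a 2 GiB pool pinned to node 0; with the 2048 kB Hugepagesize reported below, that is the nr_hugepages=1024 seen in the trace. The division is an inference, since the xtrace only shows the comparison against default_hugepages and the result:

    # inferred sizing arithmetic (all values in kB)
    size_kb=2097152                                  # requested pool: 2 GiB
    hugepagesize_kb=2048                             # Hugepagesize: 2048 kB
    nr_hugepages=$(( size_kb / hugepagesize_kb ))    # = 1024
    echo "nr_hugepages=$nr_hugepages"                # matches Hugetlb: 2097152 kB below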
00:04:17.378 12:04:03 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.378 12:04:03 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.378 12:04:03 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.378 12:04:03 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41400168 kB' 'MemAvailable: 44271028 kB' 'Buffers: 15072 kB' 'Cached: 12390608 kB' 'SwapCached: 60 kB' 'Active: 7373924 kB' 'Inactive: 5519360 kB' 'Active(anon): 6481988 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490924 kB' 'Mapped: 185692 kB' 'Shmem: 9337384 kB' 'KReclaimable: 564900 kB' 'Slab: 1511876 kB' 'SReclaimable: 564900 kB' 'SUnreclaim: 946976 kB' 'KernelStack: 21776 kB' 'PageTables: 8040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11055640 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217988 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
00:04:17.378-00:04:17.379 12:04:03 -- setup/common.sh@31-32 [xtrace condensed: the scan walks MemTotal through HardwareCorrupted; nothing matches AnonHugePages until the key itself comes up]
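verify_nr_hugepages only counts AnonHugePages when transparent hugepages are enabled, which the @96 test above decides by pattern-matching the THP mode string ("always [madvise] never" on this box). A sketch of that gate; the sysfs path is the standard THP interface and is not shown verbatim in the trace:

    # skip the anon-hugepages probe if THP is pinned to [never]
    thp_mode=$(< /sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp_mode != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # 0 kB in this run
    else
        anon=0
    fi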
00:04:17.378 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.378 12:04:03 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.378 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.378 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.378 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.378 12:04:03 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.378 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.378 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.378 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.378 12:04:03 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.378 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.378 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.378 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.378 12:04:03 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.378 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.378 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.378 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.378 12:04:03 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.378 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.378 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.378 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # 
continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ WritebackTmp == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # continue 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:17.379 12:04:03 -- setup/common.sh@33 -- # echo 0 00:04:17.379 12:04:03 -- setup/common.sh@33 -- # return 0 00:04:17.379 12:04:03 -- setup/hugepages.sh@97 -- # anon=0 00:04:17.379 12:04:03 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:17.379 12:04:03 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:17.379 12:04:03 -- setup/common.sh@18 -- # local node= 00:04:17.379 12:04:03 -- setup/common.sh@19 -- # local var val 00:04:17.379 12:04:03 -- setup/common.sh@20 -- # local mem_f mem 00:04:17.379 12:04:03 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.379 12:04:03 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.379 12:04:03 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.379 12:04:03 -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.379 12:04:03 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.379 12:04:03 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41401228 kB' 'MemAvailable: 44272088 kB' 'Buffers: 15072 kB' 'Cached: 12390612 kB' 'SwapCached: 60 kB' 'Active: 7374144 kB' 'Inactive: 5519360 kB' 'Active(anon): 6482208 
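The call traced above is setup/common.sh's get_meminfo helper: snapshot the relevant meminfo file, then walk it key by key until the requested field matches and echo the bare value. A minimal bash sketch of that flow, reconstructed from the xtrace rather than copied from the script (the real helper buffers with mapfile and strips the sysfs 'Node <n> ' prefix with an extglob pattern; this version strips per line):

    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        # Per-node counters live in sysfs; with no node argument the path
        # below does not exist, so the global /proc/meminfo is read instead.
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local line var val _
        while read -r line; do
            line=${line#"Node $node "}   # sysfs lines look like "Node 0 MemTotal: ..."
            IFS=': ' read -r var val _ <<<"$line"
            if [[ $var == "$get" ]]; then
                echo "$val"              # bare value, e.g. 0 or 1024
                return 0
            fi
        done <"$mem_f"
        return 1
    }

Called as 'get_meminfo AnonHugePages' it prints 0 on this box, which hugepages.sh@97 stores as anon=0.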
00:04:17.379 12:04:03 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:17.379 12:04:03 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:17.379 12:04:03 -- setup/common.sh@18 -- # local node=
00:04:17.379 12:04:03 -- setup/common.sh@19 -- # local var val
00:04:17.379 12:04:03 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.379 12:04:03 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.379 12:04:03 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.379 12:04:03 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.379 12:04:03 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.379 12:04:03 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.379 12:04:03 -- setup/common.sh@31 -- # IFS=': '
00:04:17.379 12:04:03 -- setup/common.sh@31 -- # read -r var val _
00:04:17.379 12:04:03 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41401228 kB' 'MemAvailable: 44272088 kB' 'Buffers: 15072 kB' 'Cached: 12390612 kB' 'SwapCached: 60 kB' 'Active: 7374144 kB' 'Inactive: 5519360 kB' 'Active(anon): 6482208 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491148 kB' 'Mapped: 185692 kB' 'Shmem: 9337388 kB' 'KReclaimable: 564900 kB' 'Slab: 1511924 kB' 'SReclaimable: 564900 kB' 'SUnreclaim: 947024 kB' 'KernelStack: 21760 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11055652 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217956 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
[xtrace elided: same per-key read/continue walk until HugePages_Surp matches]
00:04:17.381 12:04:03 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.381 12:04:03 -- setup/common.sh@33 -- # echo 0
00:04:17.381 12:04:03 -- setup/common.sh@33 -- # return 0
00:04:17.381 12:04:03 -- setup/hugepages.sh@99 -- # surp=0
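For a one-off lookup outside the harness, the same single field can be pulled with one awk invocation (an aside, not part of the SPDK scripts):

    awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo   # prints 0 on this box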
00:04:17.381 12:04:03 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:17.381 12:04:03 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:17.381 12:04:03 -- setup/common.sh@18 -- # local node=
00:04:17.381 12:04:03 -- setup/common.sh@19 -- # local var val
00:04:17.381 12:04:03 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.381 12:04:03 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.381 12:04:03 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.381 12:04:03 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.381 12:04:03 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.381 12:04:03 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.381 12:04:03 -- setup/common.sh@31 -- # IFS=': '
00:04:17.381 12:04:03 -- setup/common.sh@31 -- # read -r var val _
00:04:17.381 12:04:03 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41400724 kB' 'MemAvailable: 44271584 kB' 'Buffers: 15072 kB' 'Cached: 12390612 kB' 'SwapCached: 60 kB' 'Active: 7374184 kB' 'Inactive: 5519360 kB' 'Active(anon): 6482248 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491184 kB' 'Mapped: 185692 kB' 'Shmem: 9337388 kB' 'KReclaimable: 564900 kB' 'Slab: 1511924 kB' 'SReclaimable: 564900 kB' 'SUnreclaim: 947024 kB' 'KernelStack: 21776 kB' 'PageTables: 8032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11055668 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217940 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
[xtrace elided: same per-key read/continue walk until HugePages_Rsvd matches]
00:04:17.382 12:04:03 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:17.382 12:04:03 -- setup/common.sh@33 -- # echo 0
00:04:17.382 12:04:03 -- setup/common.sh@33 -- # return 0
00:04:17.382 12:04:03 -- setup/hugepages.sh@100 -- # resv=0
00:04:17.382 12:04:03 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:17.382 nr_hugepages=1024
00:04:17.382 12:04:03 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:17.382 resv_hugepages=0
00:04:17.382 12:04:03 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:17.382 surplus_hugepages=0
00:04:17.382 12:04:03 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:17.382 anon_hugepages=0
00:04:17.382 12:04:03 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:17.382 12:04:03 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
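The checks at hugepages.sh@107 and @109 pin down the hugepage bookkeeping: the 1024 pages the test configured must equal nr_hugepages plus the surplus and reserved counts just read, and with surp=resv=0 that collapses to nr_hugepages alone. The same arithmetic in isolation (a sketch; 'req' is assumed shorthand for the test's requested page count, not a name from the scripts):

    req=1024                 # pages the test configured, per the trace
    nr_hugepages=1024        # echoed at hugepages.sh@102
    anon=0 surp=0 resv=0     # AnonHugePages / HugePages_Surp / HugePages_Rsvd read above
    (( req == nr_hugepages + surp + resv && req == nr_hugepages )) &&
        echo "hugepage pool consistent: $nr_hugepages pages, no surplus or reserved"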
00:04:17.382 12:04:03 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:17.382 12:04:03 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:17.382 12:04:03 -- setup/common.sh@18 -- # local node=
00:04:17.382 12:04:03 -- setup/common.sh@19 -- # local var val
00:04:17.382 12:04:03 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.382 12:04:03 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.382 12:04:03 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.382 12:04:03 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.382 12:04:03 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.382 12:04:03 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.382 12:04:03 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41400932 kB' 'MemAvailable: 44271792 kB' 'Buffers: 15072 kB' 'Cached: 12390636 kB' 'SwapCached: 60 kB' 'Active: 7374180 kB' 'Inactive: 5519360 kB' 'Active(anon): 6482244 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 491152 kB' 'Mapped: 185692 kB' 'Shmem: 9337412 kB' 'KReclaimable: 564900 kB' 'Slab: 1511924 kB' 'SReclaimable: 564900 kB' 'SUnreclaim: 947024 kB' 'KernelStack: 21760 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11055684 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217940 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
00:04:17.382 12:04:03 -- setup/common.sh@31 -- # IFS=': '
00:04:17.382 12:04:03 -- setup/common.sh@31 -- # read -r var val _
[xtrace elided: same per-key read/continue walk until HugePages_Total matches]
00:04:17.384 12:04:03 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:17.384 12:04:03 -- setup/common.sh@33 -- # echo 1024
00:04:17.384 12:04:03 -- setup/common.sh@33 -- # return 0
00:04:17.384 12:04:03 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:17.384 12:04:03 -- setup/hugepages.sh@112 -- # get_nodes
00:04:17.384 12:04:03 -- setup/hugepages.sh@27 -- # local node
00:04:17.384 12:04:03 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:17.384 12:04:03 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:17.384 12:04:03 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:17.384 12:04:03 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:17.384 12:04:03 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:17.384 12:04:03 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:17.384 12:04:03 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:17.384 12:04:03 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
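get_nodes (hugepages.sh@27-33) discovers the NUMA layout by globbing the sysfs node directories; here it finds two nodes and records a per-node split of 1024 pages on node0 and 0 on node1. A sketch of that enumeration (reconstructed from the trace, not the verbatim harness; the per-node counts are filled in afterwards, which is why the trace shows nodes_sys[...]=1024 and then =0):

    shopt -s extglob nullglob
    declare -a nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        nodes_sys[${node##*node}]=0      # node index -> per-node page count to verify
    done
    no_nodes=${#nodes_sys[@]}            # 2 on this system
    (( no_nodes > 0 )) || { echo 'no NUMA nodes found' >&2; exit 1; }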
00:04:17.384 12:04:03 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:17.384 12:04:03 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:17.384 12:04:03 -- setup/common.sh@18 -- # local node=0
00:04:17.384 12:04:03 -- setup/common.sh@19 -- # local var val
00:04:17.384 12:04:03 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.384 12:04:03 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.384 12:04:03 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:17.384 12:04:03 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:17.384 12:04:03 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.384 12:04:03 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.384 12:04:03 -- setup/common.sh@31 -- # IFS=': '
00:04:17.384 12:04:03 -- setup/common.sh@31 -- # read -r var val _
00:04:17.384 12:04:03 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24038460 kB' 'MemUsed: 8595976 kB' 'SwapCached: 56 kB' 'Active: 4874672 kB' 'Inactive: 391308 kB' 'Active(anon): 4080344 kB' 'Inactive(anon): 120 kB' 'Active(file): 794328 kB' 'Inactive(file): 391188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4872216 kB' 'Mapped: 66808 kB' 'AnonPages: 397072 kB' 'Shmem: 3686644 kB' 'KernelStack: 11784 kB' 'PageTables: 5144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 199872 kB' 'Slab: 639156 kB' 'SReclaimable: 199872 kB' 'SUnreclaim: 439284 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:17.384 [trace condensed: setup/common.sh@32 tests each node0 field from MemTotal through HugePages_Free against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and continues past each non-matching field]
00:04:17.385 12:04:03 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.385 12:04:03 -- setup/common.sh@33 -- # echo 0
00:04:17.385 12:04:03 -- setup/common.sh@33 -- # return 0
00:04:17.385 12:04:03 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:17.385 12:04:03 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:17.385 12:04:03 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:17.385 12:04:03 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:17.385 12:04:03 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:17.385 node0=1024 expecting 1024
00:04:17.385 12:04:03 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:17.385 12:04:03 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:17.385 12:04:03 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:17.385 12:04:03 -- setup/hugepages.sh@202 -- # setup output
00:04:17.385 12:04:03 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:17.385 12:04:03 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:20.677 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:20.677 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
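The sorted_t/sorted_s assignments at hugepages.sh@127 earlier in this excerpt use a small bash idiom worth calling out: an indexed array subscript is evaluated arithmetically, so distinct per-node counts collapse into unique keys and the array acts as a set. A hypothetical standalone rendering of those traced steps, seeded with this run's values:

    #!/usr/bin/env bash
    # Index-as-set dedup of per-node hugepage counts (sketch; values from this run).
    declare -a nodes_test=([0]=1024) nodes_sys=([0]=1024 [1]=0)
    declare -a sorted_t sorted_s
    for node in "${!nodes_test[@]}"; do
        sorted_t[nodes_test[node]]=1   # subscript 1024 is set at most once
        sorted_s[nodes_sys[node]]=1
        echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
    done
    # One surviving key per array means every node agrees on its count.
    echo "distinct test counts: ${!sorted_t[*]}; distinct sys counts: ${!sorted_s[*]}"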
00:04:20.677 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:20.677 12:04:07 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:20.677 12:04:07 -- setup/hugepages.sh@89 -- # local node
00:04:20.677 12:04:07 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:20.677 12:04:07 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:20.677 12:04:07 -- setup/hugepages.sh@92 -- # local surp
00:04:20.677 12:04:07 -- setup/hugepages.sh@93 -- # local resv
00:04:20.677 12:04:07 -- setup/hugepages.sh@94 -- # local anon
00:04:20.677 12:04:07 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:20.677 12:04:07 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:20.677 12:04:07 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:20.677 12:04:07 -- setup/common.sh@18 -- # local node=
00:04:20.677 12:04:07 -- setup/common.sh@19 -- # local var val
00:04:20.677 12:04:07 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.677 12:04:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.677 12:04:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.677 12:04:07 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.677 12:04:07 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.677 12:04:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.677 12:04:07 -- setup/common.sh@31 -- # IFS=': '
00:04:20.677 12:04:07 -- setup/common.sh@31 -- # read -r var val _
00:04:20.677 12:04:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41424872 kB' 'MemAvailable: 44295732 kB' 'Buffers: 15072 kB' 'Cached: 12390724 kB' 'SwapCached: 60 kB' 'Active: 7375476 kB' 'Inactive: 5519360 kB' 'Active(anon): 6483540 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492264 kB' 'Mapped: 185732 kB' 'Shmem: 9337500 kB' 'KReclaimable: 564900 kB' 'Slab: 1511796 kB' 'SReclaimable: 564900 kB' 'SUnreclaim: 946896 kB' 'KernelStack: 21936 kB' 'PageTables: 8324 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11060556 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
00:04:20.677 [trace condensed: setup/common.sh@32 tests each field from MemTotal through HardwareCorrupted against \A\n\o\n\H\u\g\e\P\a\g\e\s and continues past each non-matching field]
00:04:20.678 12:04:07 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:20.678 12:04:07 -- setup/common.sh@33 -- # echo 0
00:04:20.678 12:04:07 -- setup/common.sh@33 -- # return 0
00:04:20.678 12:04:07 -- setup/hugepages.sh@97 -- # anon=0
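For context on the gate at hugepages.sh@96: the value read from /sys/kernel/mm/transparent_hugepage/enabled on this runner was "always [madvise] never", which does not contain the literal token [never] (xtrace prints the quoted glob as *\[\n\e\v\e\r\]*), so the script went on to read AnonHugePages, which came back 0. A self-contained sketch of that gate, using a plain awk lookup in place of get_meminfo:

    #!/usr/bin/env bash
    # THP gate sketch, hypothetical standalone version of hugepages.sh@96-@97.
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    anon=0
    if [[ $thp != *"[never]"* ]]; then
        # THP is not hard-disabled, so anonymous huge pages could exist:
        anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)   # 0 kB in this run
    fi
    echo "anon=$anon"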
00:04:20.678 12:04:07 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:20.678 12:04:07 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:20.678 12:04:07 -- setup/common.sh@18 -- # local node=
00:04:20.678 12:04:07 -- setup/common.sh@19 -- # local var val
00:04:20.678 12:04:07 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.678 12:04:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.678 12:04:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.678 12:04:07 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.678 12:04:07 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.678 12:04:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.678 12:04:07 -- setup/common.sh@31 -- # IFS=': '
00:04:20.678 12:04:07 -- setup/common.sh@31 -- # read -r var val _
00:04:20.679 12:04:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41423408 kB' 'MemAvailable: 44294268 kB' 'Buffers: 15072 kB' 'Cached: 12390728 kB' 'SwapCached: 60 kB' 'Active: 7376040 kB' 'Inactive: 5519360 kB' 'Active(anon): 6484104 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492796 kB' 'Mapped: 185696 kB' 'Shmem: 9337504 kB' 'KReclaimable: 564900 kB' 'Slab: 1511812 kB' 'SReclaimable: 564900 kB' 'SUnreclaim: 946912 kB' 'KernelStack: 21936 kB' 'PageTables: 8312 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11060816 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218164 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
00:04:20.679 [trace condensed: setup/common.sh@32 tests each field from MemTotal through HugePages_Rsvd against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and continues past each non-matching field]
00:04:20.680 12:04:07 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.680 12:04:07 -- setup/common.sh@33 -- # echo 0
00:04:20.680 12:04:07 -- setup/common.sh@33 -- # return 0
00:04:20.680 12:04:07 -- setup/hugepages.sh@99 -- # surp=0
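For context on these two lookups: in the kernel's hugetlb accounting, HugePages_Surp counts surplus pages allocated above nr_hugepages via overcommit, and HugePages_Rsvd (fetched next) counts pages already promised to mappings but not yet faulted in. Both must be 0 for the strict pool-size check that follows to be meaningful. A quick manual spot-check uses the standard /proc interface:

    grep -E 'HugePages_(Total|Free|Rsvd|Surp)' /proc/meminfo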
00:04:20.680 12:04:07 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:20.680 12:04:07 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:20.680 12:04:07 -- setup/common.sh@18 -- # local node=
00:04:20.680 12:04:07 -- setup/common.sh@19 -- # local var val
00:04:20.680 12:04:07 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.680 12:04:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.680 12:04:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.680 12:04:07 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.680 12:04:07 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.680 12:04:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.680 12:04:07 -- setup/common.sh@31 -- # IFS=': '
00:04:20.680 12:04:07 -- setup/common.sh@31 -- # read -r var val _
00:04:20.680 12:04:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283792 kB' 'MemFree: 41422776 kB' 'MemAvailable: 44293636 kB' 'Buffers: 15072 kB' 'Cached: 12390740 kB' 'SwapCached: 60 kB' 'Active: 7375452 kB' 'Inactive: 5519360 kB' 'Active(anon): 6483516 kB' 'Inactive(anon): 3343000 kB' 'Active(file): 891936 kB' 'Inactive(file): 2176360 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8385788 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492180 kB' 'Mapped: 185696 kB' 'Shmem: 9337516 kB' 'KReclaimable: 564900 kB' 'Slab: 1511868 kB' 'SReclaimable: 564900 kB' 'SUnreclaim: 946968 kB' 'KernelStack: 21888 kB' 'PageTables: 8148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11060832 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218164 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB'
00:04:20.680 [trace condensed: setup/common.sh@32 tests each field from MemTotal through HugePages_Free against \H\u\g\e\P\a\g\e\s\_\R\s\v\d and continues past each non-matching field]
00:04:20.681 12:04:07 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:20.681 12:04:07 -- setup/common.sh@33 -- # echo 0
00:04:20.681 12:04:07 -- setup/common.sh@33 -- # return 0
00:04:20.681 12:04:07 -- setup/hugepages.sh@100 -- # resv=0
00:04:20.681 12:04:07 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:20.681 nr_hugepages=1024
00:04:20.681 12:04:07 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:20.681 resv_hugepages=0
00:04:20.681 12:04:07 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:20.681 surplus_hugepages=0
00:04:20.681 12:04:07 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:20.681 anon_hugepages=0
00:04:20.681 12:04:07 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:20.681 12:04:07 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
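The arithmetic at @107/@109 is the heart of verify_nr_hugepages: the pool reported by the kernel must equal the requested count exactly, with no surplus or reserved slack. A self-contained sketch with this run's values (a plain awk lookup stands in for get_meminfo):

    #!/usr/bin/env bash
    # verify_nr_hugepages sketch (hugepages.sh@97-@110), values from this run.
    nr_hugepages=1024   # requested pool size
    anon=0 surp=0 resv=0
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)   # 1024 here
    (( total == nr_hugepages + surp + resv )) || { echo "pool size mismatch"; exit 1; }
    (( total == nr_hugepages )) || { echo "surplus/reserved slack"; exit 1; }
    echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp anon_hugepages=$anon"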
'SUnreclaim: 946968 kB' 'KernelStack: 21920 kB' 'PageTables: 8388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11060848 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218196 kB' 'VmallocChunk: 0 kB' 'Percpu: 99904 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2864500 kB' 'DirectMap2M: 40861696 kB' 'DirectMap1G: 26214400 kB' 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- 
setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.682 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.682 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ VmallocTotal == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue 00:04:20.683 12:04:07 -- setup/common.sh@31 -- # 
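The scan traced above is the whole of get_meminfo: read the memory table once, then walk it line by line until the requested field matches. A minimal bash sketch of that pattern, assuming the standard /proc/meminfo format; the helper name and the IFS=': ' parsing follow the traced setup/common.sh, while the for-loop body is a simplified stand-in for the traced read/continue form:

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the "Node +([0-9]) " prefix strip below

    get_meminfo() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        # Per-node queries read that node's own meminfo, whose lines carry a
        # "Node <n> " prefix that must be stripped before parsing.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local mem=() line var val _
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"   # var=field, val=number, _=unit
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }

    get_meminfo HugePages_Total      # -> 1024 in the run above
    get_meminfo HugePages_Surp 0     # -> 0 for node0

Matching on the field name rather than grepping keeps the per-node variant free: the same loop works once the "Node <n> " prefix is stripped.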
00:04:20.683 12:04:07 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:20.683 12:04:07 -- setup/hugepages.sh@112 -- # get_nodes
00:04:20.683 12:04:07 -- setup/hugepages.sh@27 -- # local node
00:04:20.683 12:04:07 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:20.683 12:04:07 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:20.683 12:04:07 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:20.683 12:04:07 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:20.683 12:04:07 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:20.683 12:04:07 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:20.683 12:04:07 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:20.683 12:04:07 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:20.683 12:04:07 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:20.683 12:04:07 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:20.683 12:04:07 -- setup/common.sh@18 -- # local node=0
00:04:20.683 12:04:07 -- setup/common.sh@19 -- # local var val
00:04:20.683 12:04:07 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.683 12:04:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.683 12:04:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:20.683 12:04:07 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:20.683 12:04:07 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.683 12:04:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.683 12:04:07 -- setup/common.sh@31 -- # IFS=': '
00:04:20.683 12:04:07 -- setup/common.sh@31 -- # read -r var val _
00:04:20.683 12:04:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24041732 kB' 'MemUsed: 8592704 kB' 'SwapCached: 56 kB' 'Active: 4876612 kB' 'Inactive: 391308 kB' 'Active(anon): 4082284 kB' 'Inactive(anon): 120 kB' 'Active(file): 794328 kB' 'Inactive(file): 391188 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4872264 kB' 'Mapped: 66812 kB' 'AnonPages: 398856 kB' 'Shmem: 3686692 kB' 'KernelStack: 12040 kB' 'PageTables: 5864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 199872 kB' 'Slab: 639012 kB' 'SReclaimable: 199872 kB' 'SUnreclaim: 439140 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:20.683 12:04:07 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.683 12:04:07 -- setup/common.sh@32 -- # continue
00:04:20.683 [... the same check/continue repeats for each remaining node0 meminfo field until the target matches ...]
00:04:20.684 12:04:07 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.684 12:04:07 -- setup/common.sh@33 -- # echo 0
00:04:20.684 12:04:07 -- setup/common.sh@33 -- # return 0
00:04:20.684 12:04:07 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:20.684 12:04:07 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:20.684 12:04:07 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:20.684 12:04:07 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:20.684 12:04:07 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:20.684 node0=1024 expecting 1024
00:04:20.684 12:04:07 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:20.684 
00:04:20.684 real 0m7.450s
00:04:20.684 user 0m2.767s
00:04:20.684 sys 0m4.821s
00:04:20.684 12:04:07 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:20.684 12:04:07 -- common/autotest_common.sh@10 -- # set +x
00:04:20.684 ************************************
00:04:20.684 END TEST no_shrink_alloc
00:04:20.684 ************************************
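For the per-node pass, the same helper is pointed at node0's meminfo and any surplus pages are folded into the node totals before the final "node0=1024 expecting 1024" comparison. A sketch of that accounting, assuming integer node ids and the get_meminfo helper sketched earlier:

    shopt -s extglob
    declare -a nodes_test
    for node in /sys/devices/system/node/node+([0-9]); do
        n=${node##*node}                                    # node0 -> 0
        nodes_test[n]=$(get_meminfo HugePages_Total "$n")
        (( nodes_test[n] += $(get_meminfo HugePages_Surp "$n") ))
        echo "node$n=${nodes_test[n]}"   # compared against the expected count
    done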
00:04:20.684 12:04:07 -- setup/hugepages.sh@217 -- # clear_hp
00:04:20.684 12:04:07 -- setup/hugepages.sh@37 -- # local node hp
00:04:20.684 12:04:07 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:20.684 12:04:07 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:20.684 12:04:07 -- setup/hugepages.sh@41 -- # echo 0
00:04:20.684 [... the echo-0 reset repeats for each hugepage size on each of the two nodes ...]
00:04:20.684 12:04:07 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:20.684 12:04:07 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:20.684 
00:04:20.684 real 0m26.356s
00:04:20.684 user 0m8.677s
00:04:20.684 sys 0m16.113s
00:04:20.684 12:04:07 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:20.684 12:04:07 -- common/autotest_common.sh@10 -- # set +x
00:04:20.684 ************************************
00:04:20.684 END TEST hugepages
00:04:20.684 ************************************
00:04:20.944 12:04:07 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:20.944 12:04:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:20.944 12:04:07 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:20.944 12:04:07 -- common/autotest_common.sh@10 -- # set +x
00:04:20.944 ************************************
00:04:20.944 START TEST driver
00:04:20.944 ************************************
00:04:20.944 12:04:07 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:20.944 * Looking for test storage...
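clear_hp undoes the allocation by zeroing every per-node hugepage pool, which is what the repeated echo-0 calls above are doing. A sketch of the equivalent loop (writing these sysfs files requires root):

    shopt -s extglob
    for node in /sys/devices/system/node/node+([0-9]); do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"   # release this node's pool of this page size
        done
    done
    export CLEAR_HUGE=yes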
00:04:20.944 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:20.944 12:04:07 -- setup/driver.sh@68 -- # setup reset
00:04:20.944 12:04:07 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:20.944 12:04:07 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:26.226 12:04:12 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:04:26.226 12:04:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:26.226 12:04:12 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:26.226 12:04:12 -- common/autotest_common.sh@10 -- # set +x
00:04:26.226 ************************************
00:04:26.226 START TEST guess_driver
00:04:26.226 ************************************
00:04:26.226 12:04:12 -- common/autotest_common.sh@1104 -- # guess_driver
00:04:26.226 12:04:12 -- setup/driver.sh@46 -- # local driver setup_driver marker
00:04:26.226 12:04:12 -- setup/driver.sh@47 -- # local fail=0
00:04:26.226 12:04:12 -- setup/driver.sh@49 -- # pick_driver
00:04:26.226 12:04:12 -- setup/driver.sh@36 -- # vfio
00:04:26.226 12:04:12 -- setup/driver.sh@21 -- # local iommu_grups
00:04:26.226 12:04:12 -- setup/driver.sh@22 -- # local unsafe_vfio
00:04:26.226 12:04:12 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]
00:04:26.226 12:04:12 -- setup/driver.sh@25 -- # unsafe_vfio=N
00:04:26.226 12:04:12 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*)
00:04:26.226 12:04:12 -- setup/driver.sh@29 -- # (( 176 > 0 ))
00:04:26.226 12:04:12 -- setup/driver.sh@30 -- # is_driver vfio_pci
00:04:26.226 12:04:12 -- setup/driver.sh@14 -- # mod vfio_pci
00:04:26.226 12:04:12 -- setup/driver.sh@12 -- # dep vfio_pci
00:04:26.226 12:04:12 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci
00:04:26.226 12:04:12 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz
00:04:26.226 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:26.226 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:26.226 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:26.226 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:26.226 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz
00:04:26.226 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz
00:04:26.226 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]]
00:04:26.226 12:04:12 -- setup/driver.sh@30 -- # return 0
00:04:26.226 12:04:12 -- setup/driver.sh@37 -- # echo vfio-pci
00:04:26.226 12:04:12 -- setup/driver.sh@49 -- # driver=vfio-pci
00:04:26.226 12:04:12 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:04:26.226 12:04:12 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
00:04:26.226 Looking for driver=vfio-pci
00:04:26.226 12:04:12 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:04:26.226 12:04:12 -- setup/driver.sh@45 -- # setup output config
00:04:26.226 12:04:12 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:26.226 12:04:12 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:29.517 12:04:15 -- setup/driver.sh@58 -- # [[ -> == \-\> ]]
00:04:29.517 12:04:15 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]]
00:04:29.517 12:04:15 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:04:29.517 [... the marker check repeats for each device line that setup.sh config prints ...]
00:04:30.936 12:04:17 -- setup/driver.sh@58 -- # [[ -> == \-\> ]]
00:04:30.936 12:04:17 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]]
00:04:30.936 12:04:17 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:04:30.936 12:04:17 -- setup/driver.sh@64 -- # (( fail == 0 ))
00:04:30.936 12:04:17 -- setup/driver.sh@65 -- # setup reset
00:04:30.936 12:04:17 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:30.936 12:04:17 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:36.205 
00:04:36.205 real 0m9.801s
00:04:36.205 user 0m2.641s
00:04:36.205 sys 0m4.988s
00:04:36.205 12:04:22 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:36.205 12:04:22 -- common/autotest_common.sh@10 -- # set +x
00:04:36.205 ************************************
00:04:36.205 END TEST guess_driver
00:04:36.205 ************************************
00:04:36.205 
00:04:36.205 real 0m14.810s
00:04:36.205 user 0m4.070s
00:04:36.205 sys 0m7.804s
00:04:36.205 12:04:22 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:36.205 12:04:22 -- common/autotest_common.sh@10 -- # set +x
00:04:36.205 ************************************
00:04:36.205 END TEST driver
00:04:36.205 ************************************
00:04:36.205 12:04:22 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
00:04:36.205 12:04:22 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:36.205 12:04:22 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:36.205 12:04:22 -- common/autotest_common.sh@10 -- # set +x
00:04:36.205 ************************************
00:04:36.205 START TEST devices
00:04:36.205 ************************************
00:04:36.205 12:04:22 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
00:04:36.205 * Looking for test storage...
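guess_driver reduces to one decision: choose vfio-pci when the IOMMU is usable and the module resolves to real kernel objects. A sketch of that logic; the iommu-group count (176 in this run) and the modprobe --show-depends probe follow the trace, while the uio_pci_generic fallback is an assumption this run never exercises:

    pick_driver() {
        local unsafe_vfio=N
        if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
            unsafe_vfio=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
        fi
        local iommu_groups=(/sys/kernel/iommu_groups/*)
        if (( ${#iommu_groups[@]} > 0 )) || [[ $unsafe_vfio == [Yy] ]]; then
            # --show-depends prints the insmod chain without loading anything
            if modprobe --show-depends vfio_pci | grep -q '\.ko'; then
                echo vfio-pci
                return 0
            fi
        fi
        echo uio_pci_generic   # assumed fallback when vfio is unusable
    }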
00:04:36.205 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:36.205 12:04:22 -- setup/devices.sh@190 -- # trap cleanup EXIT
00:04:36.205 12:04:22 -- setup/devices.sh@192 -- # setup reset
00:04:36.205 12:04:22 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:36.205 12:04:22 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:39.513 12:04:26 -- setup/devices.sh@194 -- # get_zoned_devs
00:04:39.513 12:04:26 -- common/autotest_common.sh@1654 -- # zoned_devs=()
00:04:39.513 12:04:26 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs
00:04:39.513 12:04:26 -- common/autotest_common.sh@1655 -- # local nvme bdf
00:04:39.513 12:04:26 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme*
00:04:39.513 12:04:26 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1
00:04:39.513 12:04:26 -- common/autotest_common.sh@1647 -- # local device=nvme0n1
00:04:39.513 12:04:26 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:04:39.513 12:04:26 -- common/autotest_common.sh@1650 -- # [[ none != none ]]
00:04:39.513 12:04:26 -- setup/devices.sh@196 -- # blocks=()
00:04:39.513 12:04:26 -- setup/devices.sh@196 -- # declare -a blocks
00:04:39.513 12:04:26 -- setup/devices.sh@197 -- # blocks_to_pci=()
00:04:39.513 12:04:26 -- setup/devices.sh@197 -- # declare -A blocks_to_pci
00:04:39.513 12:04:26 -- setup/devices.sh@198 -- # min_disk_size=3221225472
00:04:39.513 12:04:26 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*)
00:04:39.513 12:04:26 -- setup/devices.sh@201 -- # ctrl=nvme0n1
00:04:39.513 12:04:26 -- setup/devices.sh@201 -- # ctrl=nvme0
00:04:39.513 12:04:26 -- setup/devices.sh@202 -- # pci=0000:d8:00.0
00:04:39.513 12:04:26 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]]
00:04:39.513 12:04:26 -- setup/devices.sh@204 -- # block_in_use nvme0n1
00:04:39.513 12:04:26 -- scripts/common.sh@380 -- # local block=nvme0n1 pt
00:04:39.513 12:04:26 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1
00:04:39.513 No valid GPT data, bailing
00:04:39.513 12:04:26 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1
00:04:39.513 12:04:26 -- scripts/common.sh@393 -- # pt=
00:04:39.513 12:04:26 -- scripts/common.sh@394 -- # return 1
00:04:39.513 12:04:26 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1
00:04:39.513 12:04:26 -- setup/common.sh@76 -- # local dev=nvme0n1
00:04:39.513 12:04:26 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]]
00:04:39.513 12:04:26 -- setup/common.sh@80 -- # echo 1600321314816
00:04:39.513 12:04:26 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size ))
00:04:39.513 12:04:26 -- setup/devices.sh@205 -- # blocks+=("${block##*/}")
00:04:39.513 12:04:26 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0
00:04:39.513 12:04:26 -- setup/devices.sh@209 -- # (( 1 > 0 ))
00:04:39.513 12:04:26 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1
00:04:39.513 12:04:26 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount
00:04:39.513 12:04:26 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:39.513 12:04:26 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:39.513 12:04:26 -- common/autotest_common.sh@10 -- # set +x
00:04:39.513 ************************************
00:04:39.513 START TEST nvme_mount
00:04:39.513 ************************************
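The device filter above accepts nvme0n1 because it is not zoned, carries no partition table, and its 1600321314816 bytes clear the 3 GiB floor. A sketch of that filter under those assumptions; sec_size_to_bytes here is a simplified stand-in for the setup/common.sh helper, and blkid may require root:

    shopt -s extglob
    min_disk_size=$((3 * 1024 * 1024 * 1024))
    sec_size_to_bytes() { echo $(( $(< "/sys/block/$1/size") * 512 )); }

    for block in /sys/block/nvme!(*c*); do        # skip multipath ctrl nodes
        dev=${block##*/}
        [[ $(< "$block/queue/zoned") == none ]] || continue          # zoned: skip
        [[ -z $(blkid -s PTTYPE -o value "/dev/$dev") ]] || continue # has a PT: in use
        (( $(sec_size_to_bytes "$dev") >= min_disk_size )) && echo "/dev/$dev qualifies"
    done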
00:04:39.513 12:04:26 -- common/autotest_common.sh@1104 -- # nvme_mount
00:04:39.513 12:04:26 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1
00:04:39.513 12:04:26 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1
00:04:39.513 12:04:26 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:39.513 12:04:26 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:39.513 12:04:26 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1
00:04:39.513 12:04:26 -- setup/common.sh@39 -- # local disk=nvme0n1
00:04:39.513 12:04:26 -- setup/common.sh@40 -- # local part_no=1
00:04:39.513 12:04:26 -- setup/common.sh@41 -- # local size=1073741824
00:04:39.513 12:04:26 -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:04:39.513 12:04:26 -- setup/common.sh@44 -- # parts=()
00:04:39.513 12:04:26 -- setup/common.sh@44 -- # local parts
00:04:39.513 12:04:26 -- setup/common.sh@46 -- # (( part = 1 ))
00:04:39.513 12:04:26 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:39.513 12:04:26 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:04:39.513 12:04:26 -- setup/common.sh@46 -- # (( part++ ))
00:04:39.513 12:04:26 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:39.513 12:04:26 -- setup/common.sh@51 -- # (( size /= 512 ))
00:04:39.513 12:04:26 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:04:39.513 12:04:26 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1
00:04:40.889 Creating new GPT entries in memory.
00:04:40.889 GPT data structures destroyed! You may now partition the disk using fdisk or
00:04:40.889 other utilities.
00:04:40.889 12:04:27 -- setup/common.sh@57 -- # (( part = 1 ))
00:04:40.889 12:04:27 -- setup/common.sh@57 -- # (( part <= part_no ))
00:04:40.889 12:04:27 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:04:40.889 12:04:27 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:04:40.889 12:04:27 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:04:41.824 Creating new GPT entries in memory.
00:04:41.824 The operation has completed successfully.
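The partition step is two sgdisk calls under flock, followed by a wait for the partition uevent. A sketch with the LBA range taken from the trace (2048..2099199 is exactly 2097152 sectors, i.e. 1 GiB at 512 bytes per sector); partprobe plus udevadm settle stands in here for the script's sync_dev_uevents.sh:

    disk=/dev/nvme0n1
    sgdisk "$disk" --zap-all                           # destroy GPT and MBR structures
    flock "$disk" sgdisk "$disk" --new=1:2048:2099199  # 1 GiB partition 1
    partprobe "$disk" && udevadm settle                # wait for /dev/nvme0n1p1 to appear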
00:04:41.824 12:04:28 -- setup/common.sh@57 -- # (( part++ ))
00:04:41.824 12:04:28 -- setup/common.sh@57 -- # (( part <= part_no ))
00:04:41.824 12:04:28 -- setup/common.sh@62 -- # wait 1104384
00:04:41.825 12:04:28 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:41.825 12:04:28 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=
00:04:41.825 12:04:28 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:41.825 12:04:28 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]]
00:04:41.825 12:04:28 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1
00:04:41.825 12:04:28 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:41.825 12:04:28 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:41.825 12:04:28 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0
00:04:41.825 12:04:28 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1
00:04:41.825 12:04:28 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:41.825 12:04:28 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:41.825 12:04:28 -- setup/devices.sh@53 -- # local found=0
00:04:41.825 12:04:28 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:41.825 12:04:28 -- setup/devices.sh@56 -- # :
00:04:41.825 12:04:28 -- setup/devices.sh@59 -- # local pci status
00:04:41.825 12:04:28 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:41.825 12:04:28 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0
00:04:41.825 12:04:28 -- setup/devices.sh@47 -- # setup output config
00:04:41.825 12:04:28 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:41.825 12:04:28 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:45.108 12:04:31 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]]
00:04:45.108 12:04:31 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]]
00:04:45.108 12:04:31 -- setup/devices.sh@63 -- # found=1
00:04:45.108 12:04:31 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:45.108 12:04:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]]
00:04:45.108 12:04:31 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:45.108 [... the same check repeats for the remaining 0000:00:04.x and 0000:80:04.x devices ...]
00:04:45.108 12:04:32 -- setup/devices.sh@66 -- # (( found == 1 ))
00:04:45.108 12:04:32 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]]
00:04:45.108 12:04:32 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:45.108 12:04:32 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:45.108 12:04:32 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:45.108 12:04:32 -- setup/devices.sh@110 -- # cleanup_nvme
00:04:45.108 12:04:32 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:45.108 12:04:32 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:45.108 12:04:32 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:45.108 12:04:32 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1
00:04:45.108 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:04:45.108 12:04:32 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:04:45.108 12:04:32 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:04:45.366 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54
00:04:45.366 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54
00:04:45.366 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:04:45.366 /dev/nvme0n1: calling ioctl to re-read partition table: Success
00:04:45.366 12:04:32 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M
00:04:45.366 12:04:32 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M
00:04:45.366 12:04:32 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:45.366 12:04:32 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]]
00:04:45.366 12:04:32 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M
00:04:45.625 12:04:32 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:45.625 12:04:32 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:45.625 12:04:32 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0
00:04:45.625 12:04:32 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1
00:04:45.625 12:04:32 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:45.625 12:04:32 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:45.625 12:04:32 -- setup/devices.sh@53 -- # local found=0
00:04:45.625 12:04:32 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:45.625 12:04:32 -- setup/devices.sh@56 -- # :
00:04:45.625 12:04:32 -- setup/devices.sh@59 -- # local pci status
00:04:45.625 12:04:32 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:45.625 12:04:32 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0
00:04:45.625 12:04:32 -- setup/devices.sh@47 -- # setup output config
00:04:45.625 12:04:32 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:45.625 12:04:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:48.906 12:04:35 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]]
00:04:48.906 12:04:35 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]]
00:04:48.906 12:04:35 -- setup/devices.sh@63 -- # found=1
00:04:48.906 12:04:35 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:48.906 [... the same check repeats for the remaining 0000:00:04.x and 0000:80:04.x devices ...]
00:04:48.907 12:04:35 -- setup/devices.sh@66 -- # (( found == 1 ))
00:04:48.907 12:04:35 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]]
00:04:48.907 12:04:35 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:48.907 12:04:35 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:48.907 12:04:35 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:48.907 12:04:35 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:48.907 12:04:35 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' ''
00:04:48.907 12:04:35 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0
00:04:48.907 12:04:35 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1
00:04:48.907 12:04:35 -- setup/devices.sh@50 -- # local mount_point=
00:04:48.907 12:04:35 -- setup/devices.sh@51 -- # local test_file=
00:04:48.907 12:04:35 -- setup/devices.sh@53 -- # local found=0
00:04:48.907 12:04:35 -- setup/devices.sh@55 -- # [[ -n '' ]]
00:04:48.907 12:04:35 -- setup/devices.sh@59 -- # local pci status
00:04:48.907 12:04:35 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:48.907 12:04:35 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0
00:04:48.907 12:04:35 -- setup/devices.sh@47 -- # setup output config
00:04:48.907 12:04:35 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:48.907 12:04:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
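verify drives scripts/setup.sh config with PCI_ALLOWED pinned to the disk under test and scans its per-device status lines for the expected holder (a mount, or a raw data user). A sketch of that loop; the four-field read matches the trace, the status wording is taken from the log lines above, and $rootdir is a stand-in for the spdk checkout:

    verify() {
        local dev=$1 mounts=$2 found=0 pci _ status
        while read -r pci _ _ status; do
            # one line per PCI device; the allowed device must report $mounts
            if [[ $pci == "$dev" && $status == *"$mounts"* ]]; then
                found=1
            fi
        done < <(PCI_ALLOWED=$dev "$rootdir/scripts/setup.sh" config)
        (( found == 1 ))
    }

    verify 0000:d8:00.0 data@nvme0n1   # the unpartitioned-disk case traced next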
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:52.189 12:04:38 -- setup/devices.sh@63 -- # found=1 00:04:52.189 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.189 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.189 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.189 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.189 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.189 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.189 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.189 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.189 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.189 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.189 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.190 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.190 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.190 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.190 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.190 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.190 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.190 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.190 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.190 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.190 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.190 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.190 12:04:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.190 12:04:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.190 12:04:39 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:52.190 12:04:39 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:52.190 12:04:39 -- setup/devices.sh@68 -- # return 0 00:04:52.190 12:04:39 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:52.190 12:04:39 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:52.190 12:04:39 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:52.190 12:04:39 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:52.190 12:04:39 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:52.190 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:52.190 00:04:52.190 real 0m12.609s 00:04:52.190 user 0m3.698s 00:04:52.190 sys 0m6.853s 00:04:52.190 12:04:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:52.190 12:04:39 -- common/autotest_common.sh@10 -- # set +x 00:04:52.190 ************************************ 00:04:52.190 END TEST nvme_mount 00:04:52.190 ************************************ 00:04:52.190 12:04:39 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:52.190 12:04:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:52.190 12:04:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:52.190 12:04:39 -- common/autotest_common.sh@10 -- # set +x 00:04:52.190 ************************************ 00:04:52.190 START TEST dm_mount 00:04:52.190 ************************************ 00:04:52.190 12:04:39 -- common/autotest_common.sh@1104 -- # dm_mount 00:04:52.190 12:04:39 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:52.190 12:04:39 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:52.190 12:04:39 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:52.190 12:04:39 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:52.190 12:04:39 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:52.190 12:04:39 -- setup/common.sh@40 -- # local part_no=2 00:04:52.190 12:04:39 -- setup/common.sh@41 -- # local size=1073741824 00:04:52.190 12:04:39 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:52.190 12:04:39 -- setup/common.sh@44 -- # parts=() 00:04:52.190 12:04:39 -- setup/common.sh@44 -- # local parts 00:04:52.190 12:04:39 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:52.190 12:04:39 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:52.190 12:04:39 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:52.190 12:04:39 -- setup/common.sh@46 -- # (( part++ )) 00:04:52.190 12:04:39 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:52.190 12:04:39 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:52.190 12:04:39 -- setup/common.sh@46 -- # (( part++ )) 00:04:52.190 12:04:39 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:52.190 12:04:39 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:52.190 12:04:39 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:52.190 12:04:39 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:53.579 Creating new GPT entries in memory. 00:04:53.579 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:53.579 other utilities. 00:04:53.579 12:04:40 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:53.579 12:04:40 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:53.579 12:04:40 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:53.579 12:04:40 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:53.579 12:04:40 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:54.516 Creating new GPT entries in memory. 00:04:54.516 The operation has completed successfully. 
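For reference, the partition_drive flow being traced here (first sgdisk --new above, the second follows just after this sketch) reduces to the following bash sketch; the device name, the 1073741824/512 sector arithmetic, and the ternary start calculation are taken directly from the xtrace, so only the surrounding scaffolding is paraphrased:

  disk=/dev/nvme0n1
  size=$(( 1073741824 / 512 ))   # 1 GiB expressed in 512-byte sectors (2097152)
  sgdisk "$disk" --zap-all       # wipe any existing GPT/MBR structures first
  part_start=0 part_end=0
  for part in 1 2; do
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    flock "$disk" sgdisk "$disk" --new=${part}:${part_start}:${part_end}
  done
  # pass 1 creates nvme0n1p1 spanning sectors 2048..2099199; pass 2 creates
  # nvme0n1p2 spanning 2099200..4196351, matching the two sgdisk calls logged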
00:04:54.516 12:04:41 -- setup/common.sh@57 -- # (( part++ )) 00:04:54.516 12:04:41 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:54.516 12:04:41 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:54.516 12:04:41 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:54.516 12:04:41 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:55.451 The operation has completed successfully. 00:04:55.451 12:04:42 -- setup/common.sh@57 -- # (( part++ )) 00:04:55.451 12:04:42 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:55.451 12:04:42 -- setup/common.sh@62 -- # wait 1108890 00:04:55.451 12:04:42 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:55.451 12:04:42 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:55.451 12:04:42 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:55.451 12:04:42 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:55.451 12:04:42 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:55.451 12:04:42 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:55.451 12:04:42 -- setup/devices.sh@161 -- # break 00:04:55.451 12:04:42 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:55.451 12:04:42 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:55.451 12:04:42 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:55.451 12:04:42 -- setup/devices.sh@166 -- # dm=dm-0 00:04:55.451 12:04:42 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:55.451 12:04:42 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:55.451 12:04:42 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:55.451 12:04:42 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:55.451 12:04:42 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:55.451 12:04:42 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:55.451 12:04:42 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:55.451 12:04:42 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:55.451 12:04:42 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:55.451 12:04:42 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:55.451 12:04:42 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:55.451 12:04:42 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:55.451 12:04:42 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:55.451 12:04:42 -- setup/devices.sh@53 -- # local found=0 00:04:55.451 12:04:42 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:55.451 12:04:42 -- setup/devices.sh@56 -- # : 
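The dm steps just traced wrap both partitions in a single device-mapper node, resolve its dm-N alias, check the sysfs holders links, and then reuse the same mkfs helper (mkdir -p, mkfs.ext4 -qF, mount). The mapping table itself never appears in the xtrace because it is fed to dmsetup on stdin, so the linear concatenation below is an assumption; the device names, mount path, and mkfs invocation are from the log:

  p1=$(blockdev --getsz /dev/nvme0n1p1)   # partition lengths in 512-byte sectors
  p2=$(blockdev --getsz /dev/nvme0n1p2)
  printf '%s\n' "0 $p1 linear /dev/nvme0n1p1 0" \
                "$p1 $p2 linear /dev/nvme0n1p2 0" | dmsetup create nvme_dm_test
  dm=$(readlink -f /dev/mapper/nvme_dm_test)            # /dev/dm-0 in this run
  test -e /sys/class/block/nvme0n1p1/holders/${dm##*/}  # both partitions must
  test -e /sys/class/block/nvme0n1p2/holders/${dm##*/}  # list dm-0 as a holder
  dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount
  mkdir -p "$dm_mount"
  mkfs.ext4 -qF /dev/mapper/nvme_dm_test
  mount /dev/mapper/nvme_dm_test "$dm_mount"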
00:04:55.451 12:04:42 -- setup/devices.sh@59 -- # local pci status 00:04:55.451 12:04:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:55.451 12:04:42 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:55.451 12:04:42 -- setup/devices.sh@47 -- # setup output config 00:04:55.451 12:04:42 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.451 12:04:42 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:58.736 12:04:45 -- setup/devices.sh@63 -- # found=1 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:58.736 12:04:45 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:58.736 12:04:45 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:58.736 12:04:45 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:58.736 12:04:45 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:58.736 12:04:45 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:58.736 12:04:45 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:58.736 12:04:45 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:58.736 12:04:45 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:58.736 12:04:45 -- setup/devices.sh@50 -- # local mount_point= 00:04:58.736 12:04:45 -- setup/devices.sh@51 -- # local test_file= 00:04:58.736 12:04:45 -- setup/devices.sh@53 -- # local found=0 00:04:58.736 12:04:45 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:58.736 12:04:45 -- setup/devices.sh@59 -- # local pci status 00:04:58.736 12:04:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.736 12:04:45 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:58.736 12:04:45 -- setup/devices.sh@47 -- # setup output config 00:04:58.736 12:04:45 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:58.736 12:04:45 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:02.023 12:04:48 -- setup/devices.sh@63 -- # found=1 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.023 12:04:48 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:02.023 12:04:48 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:02.023 12:04:48 -- setup/devices.sh@68 -- # return 0 00:05:02.023 12:04:48 -- setup/devices.sh@187 -- # cleanup_dm 00:05:02.023 12:04:48 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:02.023 12:04:48 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:02.023 12:04:48 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:02.023 12:04:48 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:02.023 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:02.023 12:04:48 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:02.023 00:05:02.023 real 0m9.771s 00:05:02.023 user 0m2.251s 00:05:02.023 sys 0m4.570s 00:05:02.023 12:04:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:02.023 12:04:48 -- common/autotest_common.sh@10 -- # set +x 00:05:02.023 ************************************ 00:05:02.023 END TEST dm_mount 00:05:02.023 ************************************ 00:05:02.023 12:04:48 -- setup/devices.sh@1 -- # cleanup 00:05:02.023 12:04:48 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:02.023 12:04:48 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:02.023 12:04:48 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:02.023 12:04:48 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:02.023 12:04:48 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:02.281 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:02.281 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:02.281 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:02.281 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:02.281 12:04:49 -- setup/devices.sh@12 -- # cleanup_dm 00:05:02.281 12:04:49 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:02.281 12:04:49 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:02.281 12:04:49 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:02.281 12:04:49 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:02.281 12:04:49 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:02.281 12:04:49 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:02.281 00:05:02.281 real 0m26.674s 00:05:02.281 user 0m7.413s 00:05:02.281 sys 0m14.177s 00:05:02.281 12:04:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:02.281 12:04:49 -- common/autotest_common.sh@10 -- # set +x 00:05:02.281 ************************************ 00:05:02.281 END TEST devices 00:05:02.281 ************************************ 00:05:02.281 00:05:02.281 real 1m31.995s 00:05:02.281 user 0m27.638s 00:05:02.281 sys 0m52.919s 00:05:02.281 12:04:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:02.281 12:04:49 -- common/autotest_common.sh@10 -- # set +x 00:05:02.281 ************************************ 00:05:02.281 END TEST setup.sh 00:05:02.281 ************************************ 00:05:02.539 12:04:49 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:05.836 Hugepages 00:05:05.836 node hugesize free / total 00:05:05.836 node0 1048576kB 0 / 0 00:05:05.836 node0 2048kB 2048 / 2048 00:05:05.836 node1 1048576kB 0 / 0 00:05:05.836 node1 2048kB 0 / 0 00:05:05.836 00:05:05.836 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:05.836 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:05.836 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:05.836 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:05.836 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:05.836 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:05.836 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:05.836 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:05.836 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:05.836 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:05.836 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:05.836 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:05.836 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:05.836 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:05.836 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:05.836 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:05.836 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:05.836 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:05.836 12:04:52 -- spdk/autotest.sh@141 -- # uname -s 00:05:05.836 12:04:52 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]] 00:05:05.836 12:04:52 -- spdk/autotest.sh@143 -- # nvme_namespace_revert 00:05:05.836 12:04:52 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:09.127 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
00:05:09.127 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:09.127 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:11.031 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:11.031 12:04:57 -- common/autotest_common.sh@1517 -- # sleep 1 00:05:11.968 12:04:58 -- common/autotest_common.sh@1518 -- # bdfs=() 00:05:11.968 12:04:58 -- common/autotest_common.sh@1518 -- # local bdfs 00:05:11.968 12:04:58 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:05:11.968 12:04:58 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:05:11.968 12:04:58 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:11.968 12:04:58 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:11.968 12:04:58 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:11.968 12:04:58 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:11.968 12:04:58 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:11.968 12:04:58 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:11.968 12:04:58 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:11.968 12:04:58 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:15.251 Waiting for block devices as requested 00:05:15.251 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:15.251 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:15.251 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:15.509 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:15.509 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:15.509 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:15.768 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:15.768 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:15.768 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:15.768 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:16.027 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:16.027 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:16.027 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:16.286 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:16.286 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:16.286 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:16.548 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:16.909 12:05:03 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:05:16.909 12:05:03 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:16.909 12:05:03 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:05:16.909 12:05:03 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:05:16.909 12:05:03 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:16.909 12:05:03 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:16.909 12:05:03 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:16.909 12:05:03 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:05:16.909 12:05:03 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:05:16.909 12:05:03 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:05:16.909 12:05:03 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:05:16.909 12:05:03 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:05:16.909 12:05:03 -- common/autotest_common.sh@1530 -- # grep oacs 00:05:16.909 12:05:03 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:05:16.909 12:05:03 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:05:16.909 12:05:03 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:05:16.909 12:05:03 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:05:16.909 12:05:03 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:05:16.909 12:05:03 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:05:16.909 12:05:03 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:05:16.909 12:05:03 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:05:16.909 12:05:03 -- common/autotest_common.sh@1542 -- # continue 00:05:16.909 12:05:03 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:05:16.909 12:05:03 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:16.909 12:05:03 -- common/autotest_common.sh@10 -- # set +x 00:05:16.909 12:05:03 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:05:16.909 12:05:03 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:16.909 12:05:03 -- common/autotest_common.sh@10 -- # set +x 00:05:16.909 12:05:03 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:20.203 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:20.203 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:20.203 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:20.203 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:20.203 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:20.203 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:20.203 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:20.203 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:20.203 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:20.203 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:20.203 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:20.462 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:20.462 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:20.462 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:20.462 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:20.462 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:22.367 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:22.367 12:05:08 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:05:22.367 12:05:08 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:22.367 12:05:08 -- common/autotest_common.sh@10 -- # set +x 00:05:22.367 12:05:08 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:05:22.367 12:05:08 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:05:22.367 12:05:08 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:05:22.367 12:05:08 -- common/autotest_common.sh@1562 -- # bdfs=() 00:05:22.367 12:05:08 -- common/autotest_common.sh@1562 -- # local bdfs 00:05:22.367 12:05:08 -- common/autotest_common.sh@1564 -- # 
get_nvme_bdfs 00:05:22.367 12:05:08 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:22.367 12:05:08 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:22.367 12:05:08 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:22.367 12:05:08 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:22.367 12:05:08 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:22.367 12:05:09 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:22.367 12:05:09 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:22.367 12:05:09 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:05:22.367 12:05:09 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:22.367 12:05:09 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:05:22.367 12:05:09 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:22.367 12:05:09 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:05:22.367 12:05:09 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:05:22.367 12:05:09 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:05:22.367 12:05:09 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=1118837 00:05:22.367 12:05:09 -- common/autotest_common.sh@1583 -- # waitforlisten 1118837 00:05:22.367 12:05:09 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:22.367 12:05:09 -- common/autotest_common.sh@819 -- # '[' -z 1118837 ']' 00:05:22.367 12:05:09 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:22.368 12:05:09 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:22.368 12:05:09 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:22.368 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:22.368 12:05:09 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:22.368 12:05:09 -- common/autotest_common.sh@10 -- # set +x 00:05:22.368 [2024-11-02 12:05:09.144455] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:05:22.368 [2024-11-02 12:05:09.144518] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1118837 ] 00:05:22.368 EAL: No free 2048 kB hugepages reported on node 1 00:05:22.368 [2024-11-02 12:05:09.211509] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:22.368 [2024-11-02 12:05:09.248989] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:22.368 [2024-11-02 12:05:09.249118] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:23.306 12:05:09 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:23.306 12:05:09 -- common/autotest_common.sh@852 -- # return 0 00:05:23.306 12:05:09 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:05:23.306 12:05:09 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:05:23.306 12:05:09 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:26.593 nvme0n1 00:05:26.593 12:05:12 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:26.593 [2024-11-02 12:05:13.151911] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:26.593 request: 00:05:26.593 { 00:05:26.593 "nvme_ctrlr_name": "nvme0", 00:05:26.593 "password": "test", 00:05:26.593 "method": "bdev_nvme_opal_revert", 00:05:26.593 "req_id": 1 00:05:26.593 } 00:05:26.593 Got JSON-RPC error response 00:05:26.593 response: 00:05:26.593 { 00:05:26.593 "code": -32602, 00:05:26.593 "message": "Invalid parameters" 00:05:26.593 } 00:05:26.593 12:05:13 -- common/autotest_common.sh@1589 -- # true 00:05:26.593 12:05:13 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:05:26.593 12:05:13 -- common/autotest_common.sh@1593 -- # killprocess 1118837 00:05:26.593 12:05:13 -- common/autotest_common.sh@926 -- # '[' -z 1118837 ']' 00:05:26.593 12:05:13 -- common/autotest_common.sh@930 -- # kill -0 1118837 00:05:26.593 12:05:13 -- common/autotest_common.sh@931 -- # uname 00:05:26.593 12:05:13 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:26.593 12:05:13 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1118837 00:05:26.593 12:05:13 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:26.593 12:05:13 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:26.593 12:05:13 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1118837' 00:05:26.593 killing process with pid 1118837 00:05:26.593 12:05:13 -- common/autotest_common.sh@945 -- # kill 1118837 00:05:26.593 12:05:13 -- common/autotest_common.sh@950 -- # wait 1118837 00:05:26.593 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152
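Stepping back, the opal_revert_cleanup pass that just finished first collects NVMe BDFs and keeps those whose PCI device id is 0x0a54. Pieced together from the commands visible in the trace (the array and loop scaffolding are inferred; the paths, jq filter, and device id are verbatim), the filter looks roughly like:

  rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
  opal_bdfs=()
  for bdf in "${bdfs[@]}"; do
    device=$(cat "/sys/bus/pci/devices/$bdf/device")   # reads 0x0a54 here
    [[ $device == 0x0a54 ]] && opal_bdfs+=("$bdf")
  done
  printf '%s\n' "${opal_bdfs[@]}"                      # -> 0000:d8:00.0

Each surviving BDF is then attached via rpc.py bdev_nvme_attach_controller and handed to bdev_nvme_opal_revert, which this controller rejects with -32602 ("nvme0 not support opal"), as the JSON-RPC exchange above shows.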
00:05:28.495 12:05:15 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']' 00:05:28.495 12:05:15 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']' 00:05:28.495 12:05:15 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:28.495 12:05:15 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:28.495 12:05:15 -- spdk/autotest.sh@173 -- # timing_enter lib 00:05:28.495 12:05:15 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:28.495 12:05:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.495 12:05:15 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:28.495 12:05:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:28.495 12:05:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:28.495 12:05:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.495 ************************************ 00:05:28.495 START TEST env 00:05:28.495 ************************************ 00:05:28.495 12:05:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:28.754 * Looking for test storage... 
00:05:28.754 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:28.754 12:05:15 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:28.754 12:05:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:28.754 12:05:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:28.754 12:05:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.754 ************************************ 00:05:28.754 START TEST env_memory 00:05:28.754 ************************************ 00:05:28.754 12:05:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:28.754 00:05:28.754 00:05:28.754 CUnit - A unit testing framework for C - Version 2.1-3 00:05:28.754 http://cunit.sourceforge.net/ 00:05:28.754 00:05:28.754 00:05:28.754 Suite: memory 00:05:28.754 Test: alloc and free memory map ...[2024-11-02 12:05:15.562583] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:28.754 passed 00:05:28.754 Test: mem map translation ...[2024-11-02 12:05:15.575655] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:28.754 [2024-11-02 12:05:15.575671] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:28.754 [2024-11-02 12:05:15.575703] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:28.754 [2024-11-02 12:05:15.575712] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:28.754 passed 00:05:28.754 Test: mem map registration ...[2024-11-02 12:05:15.596442] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:28.754 [2024-11-02 12:05:15.596459] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:28.754 passed 00:05:28.754 Test: mem map adjacent registrations ...passed 00:05:28.754 00:05:28.754 Run Summary: Type Total Ran Passed Failed Inactive 00:05:28.754 suites 1 1 n/a 0 0 00:05:28.754 tests 4 4 4 0 0 00:05:28.754 asserts 152 152 152 0 n/a 00:05:28.754 00:05:28.754 Elapsed time = 0.084 seconds 00:05:28.754 00:05:28.754 real 0m0.098s 00:05:28.754 user 0m0.084s 00:05:28.754 sys 0m0.014s 00:05:28.754 12:05:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:28.754 12:05:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.754 ************************************ 00:05:28.754 END TEST env_memory 00:05:28.754 ************************************ 00:05:28.754 12:05:15 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:28.754 12:05:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:28.754 12:05:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:28.754 12:05:15 -- common/autotest_common.sh@10 
-- # set +x 00:05:28.754 ************************************ 00:05:28.754 START TEST env_vtophys 00:05:28.754 ************************************ 00:05:28.754 12:05:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:28.754 EAL: lib.eal log level changed from notice to debug 00:05:28.754 EAL: Detected lcore 0 as core 0 on socket 0 00:05:28.754 EAL: Detected lcore 1 as core 1 on socket 0 00:05:28.754 EAL: Detected lcore 2 as core 2 on socket 0 00:05:28.754 EAL: Detected lcore 3 as core 3 on socket 0 00:05:28.754 EAL: Detected lcore 4 as core 4 on socket 0 00:05:28.754 EAL: Detected lcore 5 as core 5 on socket 0 00:05:28.754 EAL: Detected lcore 6 as core 6 on socket 0 00:05:28.754 EAL: Detected lcore 7 as core 8 on socket 0 00:05:28.754 EAL: Detected lcore 8 as core 9 on socket 0 00:05:28.754 EAL: Detected lcore 9 as core 10 on socket 0 00:05:28.754 EAL: Detected lcore 10 as core 11 on socket 0 00:05:28.754 EAL: Detected lcore 11 as core 12 on socket 0 00:05:28.754 EAL: Detected lcore 12 as core 13 on socket 0 00:05:28.754 EAL: Detected lcore 13 as core 14 on socket 0 00:05:28.754 EAL: Detected lcore 14 as core 16 on socket 0 00:05:28.754 EAL: Detected lcore 15 as core 17 on socket 0 00:05:28.754 EAL: Detected lcore 16 as core 18 on socket 0 00:05:28.754 EAL: Detected lcore 17 as core 19 on socket 0 00:05:28.754 EAL: Detected lcore 18 as core 20 on socket 0 00:05:28.754 EAL: Detected lcore 19 as core 21 on socket 0 00:05:28.754 EAL: Detected lcore 20 as core 22 on socket 0 00:05:28.754 EAL: Detected lcore 21 as core 24 on socket 0 00:05:28.754 EAL: Detected lcore 22 as core 25 on socket 0 00:05:28.754 EAL: Detected lcore 23 as core 26 on socket 0 00:05:28.754 EAL: Detected lcore 24 as core 27 on socket 0 00:05:28.754 EAL: Detected lcore 25 as core 28 on socket 0 00:05:28.754 EAL: Detected lcore 26 as core 29 on socket 0 00:05:28.754 EAL: Detected lcore 27 as core 30 on socket 0 00:05:28.754 EAL: Detected lcore 28 as core 0 on socket 1 00:05:28.754 EAL: Detected lcore 29 as core 1 on socket 1 00:05:28.754 EAL: Detected lcore 30 as core 2 on socket 1 00:05:28.754 EAL: Detected lcore 31 as core 3 on socket 1 00:05:28.754 EAL: Detected lcore 32 as core 4 on socket 1 00:05:28.754 EAL: Detected lcore 33 as core 5 on socket 1 00:05:28.754 EAL: Detected lcore 34 as core 6 on socket 1 00:05:28.754 EAL: Detected lcore 35 as core 8 on socket 1 00:05:28.754 EAL: Detected lcore 36 as core 9 on socket 1 00:05:28.754 EAL: Detected lcore 37 as core 10 on socket 1 00:05:28.754 EAL: Detected lcore 38 as core 11 on socket 1 00:05:28.754 EAL: Detected lcore 39 as core 12 on socket 1 00:05:28.754 EAL: Detected lcore 40 as core 13 on socket 1 00:05:28.754 EAL: Detected lcore 41 as core 14 on socket 1 00:05:28.754 EAL: Detected lcore 42 as core 16 on socket 1 00:05:28.754 EAL: Detected lcore 43 as core 17 on socket 1 00:05:28.754 EAL: Detected lcore 44 as core 18 on socket 1 00:05:28.754 EAL: Detected lcore 45 as core 19 on socket 1 00:05:28.754 EAL: Detected lcore 46 as core 20 on socket 1 00:05:28.754 EAL: Detected lcore 47 as core 21 on socket 1 00:05:28.754 EAL: Detected lcore 48 as core 22 on socket 1 00:05:28.754 EAL: Detected lcore 49 as core 24 on socket 1 00:05:28.754 EAL: Detected lcore 50 as core 25 on socket 1 00:05:28.754 EAL: Detected lcore 51 as core 26 on socket 1 00:05:28.754 EAL: Detected lcore 52 as core 27 on socket 1 00:05:28.754 EAL: Detected lcore 53 as core 28 on socket 1 00:05:28.754 EAL: Detected lcore 54 as core 
29 on socket 1 00:05:28.754 EAL: Detected lcore 55 as core 30 on socket 1 00:05:28.754 EAL: Detected lcore 56 as core 0 on socket 0 00:05:28.754 EAL: Detected lcore 57 as core 1 on socket 0 00:05:28.754 EAL: Detected lcore 58 as core 2 on socket 0 00:05:28.754 EAL: Detected lcore 59 as core 3 on socket 0 00:05:28.754 EAL: Detected lcore 60 as core 4 on socket 0 00:05:28.754 EAL: Detected lcore 61 as core 5 on socket 0 00:05:28.754 EAL: Detected lcore 62 as core 6 on socket 0 00:05:28.754 EAL: Detected lcore 63 as core 8 on socket 0 00:05:28.754 EAL: Detected lcore 64 as core 9 on socket 0 00:05:28.754 EAL: Detected lcore 65 as core 10 on socket 0 00:05:28.754 EAL: Detected lcore 66 as core 11 on socket 0 00:05:28.754 EAL: Detected lcore 67 as core 12 on socket 0 00:05:28.754 EAL: Detected lcore 68 as core 13 on socket 0 00:05:28.754 EAL: Detected lcore 69 as core 14 on socket 0 00:05:28.754 EAL: Detected lcore 70 as core 16 on socket 0 00:05:28.754 EAL: Detected lcore 71 as core 17 on socket 0 00:05:28.754 EAL: Detected lcore 72 as core 18 on socket 0 00:05:28.754 EAL: Detected lcore 73 as core 19 on socket 0 00:05:28.754 EAL: Detected lcore 74 as core 20 on socket 0 00:05:28.754 EAL: Detected lcore 75 as core 21 on socket 0 00:05:28.754 EAL: Detected lcore 76 as core 22 on socket 0 00:05:28.754 EAL: Detected lcore 77 as core 24 on socket 0 00:05:28.754 EAL: Detected lcore 78 as core 25 on socket 0 00:05:28.754 EAL: Detected lcore 79 as core 26 on socket 0 00:05:28.754 EAL: Detected lcore 80 as core 27 on socket 0 00:05:28.754 EAL: Detected lcore 81 as core 28 on socket 0 00:05:28.754 EAL: Detected lcore 82 as core 29 on socket 0 00:05:28.754 EAL: Detected lcore 83 as core 30 on socket 0 00:05:28.754 EAL: Detected lcore 84 as core 0 on socket 1 00:05:28.755 EAL: Detected lcore 85 as core 1 on socket 1 00:05:28.755 EAL: Detected lcore 86 as core 2 on socket 1 00:05:28.755 EAL: Detected lcore 87 as core 3 on socket 1 00:05:28.755 EAL: Detected lcore 88 as core 4 on socket 1 00:05:28.755 EAL: Detected lcore 89 as core 5 on socket 1 00:05:28.755 EAL: Detected lcore 90 as core 6 on socket 1 00:05:28.755 EAL: Detected lcore 91 as core 8 on socket 1 00:05:28.755 EAL: Detected lcore 92 as core 9 on socket 1 00:05:28.755 EAL: Detected lcore 93 as core 10 on socket 1 00:05:28.755 EAL: Detected lcore 94 as core 11 on socket 1 00:05:28.755 EAL: Detected lcore 95 as core 12 on socket 1 00:05:28.755 EAL: Detected lcore 96 as core 13 on socket 1 00:05:28.755 EAL: Detected lcore 97 as core 14 on socket 1 00:05:28.755 EAL: Detected lcore 98 as core 16 on socket 1 00:05:28.755 EAL: Detected lcore 99 as core 17 on socket 1 00:05:28.755 EAL: Detected lcore 100 as core 18 on socket 1 00:05:28.755 EAL: Detected lcore 101 as core 19 on socket 1 00:05:28.755 EAL: Detected lcore 102 as core 20 on socket 1 00:05:28.755 EAL: Detected lcore 103 as core 21 on socket 1 00:05:28.755 EAL: Detected lcore 104 as core 22 on socket 1 00:05:28.755 EAL: Detected lcore 105 as core 24 on socket 1 00:05:28.755 EAL: Detected lcore 106 as core 25 on socket 1 00:05:28.755 EAL: Detected lcore 107 as core 26 on socket 1 00:05:28.755 EAL: Detected lcore 108 as core 27 on socket 1 00:05:28.755 EAL: Detected lcore 109 as core 28 on socket 1 00:05:28.755 EAL: Detected lcore 110 as core 29 on socket 1 00:05:28.755 EAL: Detected lcore 111 as core 30 on socket 1 00:05:28.755 EAL: Maximum logical cores by configuration: 128 00:05:28.755 EAL: Detected CPU lcores: 112 00:05:28.755 EAL: Detected NUMA nodes: 2 00:05:28.755 EAL: Checking presence 
of .so 'librte_eal.so.23.0' 00:05:28.755 EAL: Checking presence of .so 'librte_eal.so.23' 00:05:28.755 EAL: Checking presence of .so 'librte_eal.so' 00:05:28.755 EAL: Detected static linkage of DPDK 00:05:28.755 EAL: No shared files mode enabled, IPC will be disabled 00:05:28.755 EAL: Bus pci wants IOVA as 'DC' 00:05:28.755 EAL: Buses did not request a specific IOVA mode. 00:05:28.755 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:28.755 EAL: Selected IOVA mode 'VA' 00:05:28.755 EAL: No free 2048 kB hugepages reported on node 1 00:05:28.755 EAL: Probing VFIO support... 00:05:28.755 EAL: IOMMU type 1 (Type 1) is supported 00:05:28.755 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:28.755 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:28.755 EAL: VFIO support initialized 00:05:28.755 EAL: Ask a virtual area of 0x2e000 bytes 00:05:28.755 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:28.755 EAL: Setting up physically contiguous memory... 00:05:28.755 EAL: Setting maximum number of open files to 524288 00:05:28.755 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:28.755 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:28.755 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:28.755 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.755 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:28.755 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:28.755 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.755 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:28.755 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:28.755 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.755 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:28.755 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:28.755 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.755 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:28.755 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:28.755 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.755 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:28.755 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:28.755 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.755 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:28.755 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:28.755 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.755 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:28.755 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:28.755 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.755 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:28.755 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:28.755 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:28.755 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.755 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:28.755 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:28.755 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.755 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:28.755 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:28.755 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.755 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 
00:05:28.755 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:28.755 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.755 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:28.755 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:28.755 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.755 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:28.755 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:28.755 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.755 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:28.755 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:28.755 EAL: Ask a virtual area of 0x61000 bytes 00:05:28.755 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:28.755 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:28.755 EAL: Ask a virtual area of 0x400000000 bytes 00:05:28.755 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:28.755 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:28.755 EAL: Hugepages will be freed exactly as allocated. 00:05:28.755 EAL: No shared files mode enabled, IPC is disabled 00:05:28.755 EAL: No shared files mode enabled, IPC is disabled 00:05:28.755 EAL: TSC frequency is ~2500000 KHz 00:05:28.755 EAL: Main lcore 0 is ready (tid=7f5ff78fba00;cpuset=[0]) 00:05:28.755 EAL: Trying to obtain current memory policy. 00:05:28.755 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:28.755 EAL: Restoring previous memory policy: 0 00:05:28.755 EAL: request: mp_malloc_sync 00:05:28.755 EAL: No shared files mode enabled, IPC is disabled 00:05:28.755 EAL: Heap on socket 0 was expanded by 2MB 00:05:28.755 EAL: No shared files mode enabled, IPC is disabled 00:05:29.013 EAL: Mem event callback 'spdk:(nil)' registered 00:05:29.013 00:05:29.013 00:05:29.014 CUnit - A unit testing framework for C - Version 2.1-3 00:05:29.014 http://cunit.sourceforge.net/ 00:05:29.014 00:05:29.014 00:05:29.014 Suite: components_suite 00:05:29.014 Test: vtophys_malloc_test ...passed 00:05:29.014 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:29.014 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.014 EAL: Restoring previous memory policy: 4 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was expanded by 4MB 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was shrunk by 4MB 00:05:29.014 EAL: Trying to obtain current memory policy. 00:05:29.014 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.014 EAL: Restoring previous memory policy: 4 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was expanded by 6MB 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was shrunk by 6MB 00:05:29.014 EAL: Trying to obtain current memory policy. 
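The vtophys_malloc_test above allocates DMA-safe buffers and checks that each virtual address translates to a usable physical (or IOVA) address; the "Heap on socket 0 was expanded/shrunk" messages that follow come from its spdk-malloc sibling doing the same through the DPDK heap at growing sizes. A minimal sketch of the API pair being exercised, spdk_dma_zmalloc() and spdk_vtophys() from spdk/env.h — an illustration against an already-built SPDK tree, not the test's actual code:

    #include <inttypes.h>
    #include <stdio.h>
    #include "spdk/env.h"

    int main(void)
    {
        struct spdk_env_opts opts;
        spdk_env_opts_init(&opts);
        opts.name = "vtophys_sketch";   /* hypothetical app name */
        if (spdk_env_init(&opts) < 0) {
            fprintf(stderr, "spdk_env_init failed\n");
            return 1;
        }

        /* 2 MB DMA-safe buffer; the NULL means "no phys addr out-param". */
        void *buf = spdk_dma_zmalloc(2 * 1024 * 1024, 0x1000, NULL);
        if (buf == NULL) {
            return 1;
        }

        uint64_t size = 2 * 1024 * 1024;
        uint64_t paddr = spdk_vtophys(buf, &size);
        if (paddr == SPDK_VTOPHYS_ERROR) {
            fprintf(stderr, "translation failed\n");
        } else {
            printf("vaddr %p -> paddr/IOVA 0x%" PRIx64 "\n", buf, paddr);
        }

        spdk_dma_free(buf);
        return 0;
    }

With IOVA mode 'VA' selected during the EAL bring-up above, the returned address is simply the virtual address, which is why the test can run unprivileged under VFIO.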
00:05:29.014 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.014 EAL: Restoring previous memory policy: 4 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was expanded by 10MB 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was shrunk by 10MB 00:05:29.014 EAL: Trying to obtain current memory policy. 00:05:29.014 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.014 EAL: Restoring previous memory policy: 4 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was expanded by 18MB 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was shrunk by 18MB 00:05:29.014 EAL: Trying to obtain current memory policy. 00:05:29.014 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.014 EAL: Restoring previous memory policy: 4 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was expanded by 34MB 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was shrunk by 34MB 00:05:29.014 EAL: Trying to obtain current memory policy. 00:05:29.014 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.014 EAL: Restoring previous memory policy: 4 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was expanded by 66MB 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was shrunk by 66MB 00:05:29.014 EAL: Trying to obtain current memory policy. 00:05:29.014 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.014 EAL: Restoring previous memory policy: 4 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was expanded by 130MB 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was shrunk by 130MB 00:05:29.014 EAL: Trying to obtain current memory policy. 
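Each expand/shrink round in this suite is bracketed by "Setting policy MPOL_PREFERRED for socket 0" and "Restoring previous memory policy": before growing the heap, the allocator steers new pages toward the target socket via the NUMA memory-policy syscalls and puts the old policy back afterwards. Roughly, and only as an illustration of the libnuma set_mempolicy(2) pattern rather than EAL's exact code:

    #include <numaif.h>   /* set_mempolicy, get_mempolicy; link with -lnuma */
    #include <stdio.h>

    static int prefer_node0_then_restore(void)
    {
        int old_mode = 0;
        unsigned long old_mask = 0;   /* single-word mask: a simplification */

        /* Save the calling thread's current policy... */
        if (get_mempolicy(&old_mode, &old_mask, sizeof(old_mask) * 8, NULL, 0) != 0) {
            perror("get_mempolicy");
            return -1;
        }

        /* ...prefer NUMA node 0 for newly faulted pages... */
        unsigned long node0 = 1UL << 0;
        if (set_mempolicy(MPOL_PREFERRED, &node0, sizeof(node0) * 8) != 0) {
            perror("set_mempolicy");
            return -1;
        }

        /* (allocate and touch hugepages here) */

        /* ...and restore the previous policy. */
        if (old_mode == MPOL_DEFAULT) {
            return set_mempolicy(MPOL_DEFAULT, NULL, 0);
        }
        return set_mempolicy(old_mode, &old_mask, sizeof(old_mask) * 8);
    }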
00:05:29.014 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.014 EAL: Restoring previous memory policy: 4 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was expanded by 258MB 00:05:29.014 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.014 EAL: request: mp_malloc_sync 00:05:29.014 EAL: No shared files mode enabled, IPC is disabled 00:05:29.014 EAL: Heap on socket 0 was shrunk by 258MB 00:05:29.014 EAL: Trying to obtain current memory policy. 00:05:29.014 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.272 EAL: Restoring previous memory policy: 4 00:05:29.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.272 EAL: request: mp_malloc_sync 00:05:29.272 EAL: No shared files mode enabled, IPC is disabled 00:05:29.272 EAL: Heap on socket 0 was expanded by 514MB 00:05:29.272 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.272 EAL: request: mp_malloc_sync 00:05:29.272 EAL: No shared files mode enabled, IPC is disabled 00:05:29.272 EAL: Heap on socket 0 was shrunk by 514MB 00:05:29.272 EAL: Trying to obtain current memory policy. 00:05:29.272 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.530 EAL: Restoring previous memory policy: 4 00:05:29.530 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.530 EAL: request: mp_malloc_sync 00:05:29.530 EAL: No shared files mode enabled, IPC is disabled 00:05:29.530 EAL: Heap on socket 0 was expanded by 1026MB 00:05:29.788 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.788 EAL: request: mp_malloc_sync 00:05:29.788 EAL: No shared files mode enabled, IPC is disabled 00:05:29.788 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:29.788 passed 00:05:29.788 00:05:29.788 Run Summary: Type Total Ran Passed Failed Inactive 00:05:29.788 suites 1 1 n/a 0 0 00:05:29.788 tests 2 2 2 0 0 00:05:29.788 asserts 497 497 497 0 n/a 00:05:29.788 00:05:29.788 Elapsed time = 0.961 seconds 00:05:29.788 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.788 EAL: request: mp_malloc_sync 00:05:29.788 EAL: No shared files mode enabled, IPC is disabled 00:05:29.788 EAL: Heap on socket 0 was shrunk by 2MB 00:05:29.788 EAL: No shared files mode enabled, IPC is disabled 00:05:29.789 EAL: No shared files mode enabled, IPC is disabled 00:05:29.789 EAL: No shared files mode enabled, IPC is disabled 00:05:29.789 00:05:29.789 real 0m1.076s 00:05:29.789 user 0m0.620s 00:05:29.789 sys 0m0.431s 00:05:29.789 12:05:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:29.789 12:05:16 -- common/autotest_common.sh@10 -- # set +x 00:05:29.789 ************************************ 00:05:29.789 END TEST env_vtophys 00:05:29.789 ************************************ 00:05:30.047 12:05:16 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:30.047 12:05:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:30.047 12:05:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:30.047 12:05:16 -- common/autotest_common.sh@10 -- # set +x 00:05:30.047 ************************************ 00:05:30.047 START TEST env_pci 00:05:30.047 ************************************ 00:05:30.047 12:05:16 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:30.047 00:05:30.047 00:05:30.047 CUnit - A unit testing framework for C - Version 2.1-3 00:05:30.047 
http://cunit.sourceforge.net/ 00:05:30.047 00:05:30.047 00:05:30.047 Suite: pci 00:05:30.047 Test: pci_hook ...[2024-11-02 12:05:16.810631] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1120251 has claimed it 00:05:30.047 EAL: Cannot find device (10000:00:01.0) 00:05:30.047 EAL: Failed to attach device on primary process 00:05:30.047 passed 00:05:30.047 00:05:30.047 Run Summary: Type Total Ran Passed Failed Inactive 00:05:30.047 suites 1 1 n/a 0 0 00:05:30.047 tests 1 1 1 0 0 00:05:30.047 asserts 25 25 25 0 n/a 00:05:30.047 00:05:30.047 Elapsed time = 0.037 seconds 00:05:30.047 00:05:30.047 real 0m0.055s 00:05:30.047 user 0m0.015s 00:05:30.047 sys 0m0.040s 00:05:30.047 12:05:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:30.047 12:05:16 -- common/autotest_common.sh@10 -- # set +x 00:05:30.047 ************************************ 00:05:30.047 END TEST env_pci 00:05:30.047 ************************************ 00:05:30.047 12:05:16 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:30.047 12:05:16 -- env/env.sh@15 -- # uname 00:05:30.047 12:05:16 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:30.047 12:05:16 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:30.047 12:05:16 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:30.047 12:05:16 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']' 00:05:30.047 12:05:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:30.047 12:05:16 -- common/autotest_common.sh@10 -- # set +x 00:05:30.047 ************************************ 00:05:30.047 START TEST env_dpdk_post_init 00:05:30.047 ************************************ 00:05:30.047 12:05:16 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:30.047 EAL: Detected CPU lcores: 112 00:05:30.047 EAL: Detected NUMA nodes: 2 00:05:30.047 EAL: Detected static linkage of DPDK 00:05:30.047 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:30.047 EAL: Selected IOVA mode 'VA' 00:05:30.047 EAL: No free 2048 kB hugepages reported on node 1 00:05:30.047 EAL: VFIO support initialized 00:05:30.047 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:30.047 EAL: Using IOMMU type 1 (Type 1) 00:05:30.984 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:35.168 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:35.168 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:35.168 Starting DPDK initialization... 00:05:35.168 Starting SPDK post initialization... 00:05:35.168 SPDK NVMe probe 00:05:35.168 Attaching to 0000:d8:00.0 00:05:35.169 Attached to 0000:d8:00.0 00:05:35.169 Cleaning up... 
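A few entries up, the pci_hook failure was the expected result: the test pre-claims device 10000:00:01.0, so the second claim attempt fails with "Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0". SPDK arbitrates device ownership between processes with a per-BDF lock file; a simplified illustration of that pattern using flock(2) — the path format matches the log, but the locking details here are an assumption, not SPDK's exact implementation:

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/file.h>
    #include <unistd.h>

    /* Try to take exclusive ownership of a PCI device by BDF string,
     * e.g. claim_device("10000:00:01.0"). Returns the held fd, or -1
     * if another process already owns the claim. */
    static int claim_device(const char *bdf)
    {
        char path[128];
        snprintf(path, sizeof(path), "/var/tmp/spdk_pci_lock_%s", bdf);

        int fd = open(path, O_RDWR | O_CREAT, 0600);
        if (fd < 0) {
            return -1;
        }
        if (flock(fd, LOCK_EX | LOCK_NB) != 0) {
            /* Somebody else (here: the test itself) holds the claim. */
            close(fd);
            return -1;
        }
        return fd;   /* keep open for as long as the claim should last */
    }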
00:05:35.169 00:05:35.169 real 0m4.730s 00:05:35.169 user 0m3.577s 00:05:35.169 sys 0m0.395s 00:05:35.169 12:05:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.169 12:05:21 -- common/autotest_common.sh@10 -- # set +x 00:05:35.169 ************************************ 00:05:35.169 END TEST env_dpdk_post_init 00:05:35.169 ************************************ 00:05:35.169 12:05:21 -- env/env.sh@26 -- # uname 00:05:35.169 12:05:21 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:35.169 12:05:21 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:35.169 12:05:21 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:35.169 12:05:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:35.169 12:05:21 -- common/autotest_common.sh@10 -- # set +x 00:05:35.169 ************************************ 00:05:35.169 START TEST env_mem_callbacks 00:05:35.169 ************************************ 00:05:35.169 12:05:21 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:35.169 EAL: Detected CPU lcores: 112 00:05:35.169 EAL: Detected NUMA nodes: 2 00:05:35.169 EAL: Detected static linkage of DPDK 00:05:35.169 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:35.169 EAL: Selected IOVA mode 'VA' 00:05:35.169 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.169 EAL: VFIO support initialized 00:05:35.169 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:35.169 00:05:35.169 00:05:35.169 CUnit - A unit testing framework for C - Version 2.1-3 00:05:35.169 http://cunit.sourceforge.net/ 00:05:35.169 00:05:35.169 00:05:35.169 Suite: memory 00:05:35.169 Test: test ... 
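The register/unregister trace that follows is printed by a test callback hooked into SPDK's memory map: each malloc that grows the DPDK heap fires a register notification for the newly added 2 MB-granular region, and each free that returns memory to the OS fires an unregister. A minimal sketch of installing such a hook with spdk_mem_map_alloc() and a spdk_mem_map_ops notify callback — same spdk/env.h API, but an illustration rather than the test's code:

    #include <stdio.h>
    #include "spdk/env.h"

    static int
    mem_notify(void *cb_ctx, struct spdk_mem_map *map,
               enum spdk_mem_map_notify_action action,
               void *vaddr, size_t size)
    {
        printf("%s %p %zu\n",
               action == SPDK_MEM_MAP_NOTIFY_REGISTER ? "register" : "unregister",
               vaddr, size);
        return 0;   /* non-zero would fail the (un)registration */
    }

    static const struct spdk_mem_map_ops notify_ops = {
        .notify_cb = mem_notify,
        .are_contiguous = NULL,
    };

    /* Call after spdk_env_init(); the callback is also replayed for every
     * region that is already registered at alloc time. */
    static struct spdk_mem_map *install_hook(void)
    {
        return spdk_mem_map_alloc(0, &notify_ops, NULL);
    }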
00:05:35.169 register 0x200000200000 2097152 00:05:35.169 malloc 3145728 00:05:35.169 register 0x200000400000 4194304 00:05:35.169 buf 0x200000500000 len 3145728 PASSED 00:05:35.169 malloc 64 00:05:35.169 buf 0x2000004fff40 len 64 PASSED 00:05:35.169 malloc 4194304 00:05:35.169 register 0x200000800000 6291456 00:05:35.169 buf 0x200000a00000 len 4194304 PASSED 00:05:35.169 free 0x200000500000 3145728 00:05:35.169 free 0x2000004fff40 64 00:05:35.169 unregister 0x200000400000 4194304 PASSED 00:05:35.169 free 0x200000a00000 4194304 00:05:35.169 unregister 0x200000800000 6291456 PASSED 00:05:35.169 malloc 8388608 00:05:35.169 register 0x200000400000 10485760 00:05:35.169 buf 0x200000600000 len 8388608 PASSED 00:05:35.169 free 0x200000600000 8388608 00:05:35.169 unregister 0x200000400000 10485760 PASSED 00:05:35.169 passed 00:05:35.169 00:05:35.169 Run Summary: Type Total Ran Passed Failed Inactive 00:05:35.169 suites 1 1 n/a 0 0 00:05:35.169 tests 1 1 1 0 0 00:05:35.169 asserts 15 15 15 0 n/a 00:05:35.169 00:05:35.169 Elapsed time = 0.005 seconds 00:05:35.169 00:05:35.169 real 0m0.062s 00:05:35.169 user 0m0.015s 00:05:35.169 sys 0m0.047s 00:05:35.169 12:05:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.169 12:05:21 -- common/autotest_common.sh@10 -- # set +x 00:05:35.169 ************************************ 00:05:35.169 END TEST env_mem_callbacks 00:05:35.169 ************************************ 00:05:35.169 00:05:35.169 real 0m6.372s 00:05:35.169 user 0m4.424s 00:05:35.169 sys 0m1.212s 00:05:35.169 12:05:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:35.169 12:05:21 -- common/autotest_common.sh@10 -- # set +x 00:05:35.169 ************************************ 00:05:35.169 END TEST env 00:05:35.169 ************************************ 00:05:35.169 12:05:21 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:35.169 12:05:21 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:35.169 12:05:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:35.169 12:05:21 -- common/autotest_common.sh@10 -- # set +x 00:05:35.169 ************************************ 00:05:35.169 START TEST rpc 00:05:35.169 ************************************ 00:05:35.169 12:05:21 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:35.169 * Looking for test storage... 00:05:35.169 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:35.169 12:05:21 -- rpc/rpc.sh@65 -- # spdk_pid=1121311 00:05:35.169 12:05:21 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:35.169 12:05:21 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:35.169 12:05:21 -- rpc/rpc.sh@67 -- # waitforlisten 1121311 00:05:35.169 12:05:21 -- common/autotest_common.sh@819 -- # '[' -z 1121311 ']' 00:05:35.169 12:05:21 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.169 12:05:21 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:35.169 12:05:21 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.169 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
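With the env suite done, rpc.sh launches spdk_tgt and waits for it to listen on /var/tmp/spdk.sock. Everything rpc_cmd does from here on is plain JSON-RPC 2.0 over that Unix socket; a minimal hand-rolled client, shown for the bdev_get_bdevs method the integrity tests call below (method name and socket path are taken from this log; error handling is pared down):

    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <sys/un.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = socket(AF_UNIX, SOCK_STREAM, 0);
        if (fd < 0) {
            return 1;
        }

        struct sockaddr_un addr = { .sun_family = AF_UNIX };
        strncpy(addr.sun_path, "/var/tmp/spdk.sock", sizeof(addr.sun_path) - 1);
        if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) != 0) {
            perror("connect");
            return 1;
        }

        /* The same request rpc_cmd bdev_get_bdevs ends up sending. */
        const char *req =
            "{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"bdev_get_bdevs\"}";
        if (write(fd, req, strlen(req)) < 0) {
            return 1;
        }

        char buf[8192];
        ssize_t n = read(fd, buf, sizeof(buf) - 1);
        if (n > 0) {
            buf[n] = '\0';
            printf("%s\n", buf);   /* the same JSON array dumped below */
        }
        close(fd);
        return 0;
    }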
00:05:35.169 12:05:21 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:35.169 12:05:21 -- common/autotest_common.sh@10 -- # set +x 00:05:35.169 [2024-11-02 12:05:21.972812] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:05:35.169 [2024-11-02 12:05:21.972888] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1121311 ] 00:05:35.169 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.169 [2024-11-02 12:05:22.040172] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.169 [2024-11-02 12:05:22.078505] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:35.169 [2024-11-02 12:05:22.078611] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:35.169 [2024-11-02 12:05:22.078621] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1121311' to capture a snapshot of events at runtime. 00:05:35.169 [2024-11-02 12:05:22.078630] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1121311 for offline analysis/debug. 00:05:35.169 [2024-11-02 12:05:22.078652] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.103 12:05:22 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:36.103 12:05:22 -- common/autotest_common.sh@852 -- # return 0 00:05:36.103 12:05:22 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:36.103 12:05:22 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:36.103 12:05:22 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:36.103 12:05:22 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:36.103 12:05:22 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:36.103 12:05:22 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.103 12:05:22 -- common/autotest_common.sh@10 -- # set +x 00:05:36.103 ************************************ 00:05:36.103 START TEST rpc_integrity 00:05:36.103 ************************************ 00:05:36.103 12:05:22 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:36.103 12:05:22 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:36.103 12:05:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.103 12:05:22 -- common/autotest_common.sh@10 -- # set +x 00:05:36.103 12:05:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.103 12:05:22 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:36.103 12:05:22 -- rpc/rpc.sh@13 -- # jq length 00:05:36.103 12:05:22 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:36.104 12:05:22 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:36.104 12:05:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.104 12:05:22 -- common/autotest_common.sh@10 -- # set +x 00:05:36.104 12:05:22 -- 
common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.104 12:05:22 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:36.104 12:05:22 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:36.104 12:05:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.104 12:05:22 -- common/autotest_common.sh@10 -- # set +x 00:05:36.104 12:05:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.104 12:05:22 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:36.104 { 00:05:36.104 "name": "Malloc0", 00:05:36.104 "aliases": [ 00:05:36.104 "cd9d0a99-563e-4e69-b81a-a1e77f428293" 00:05:36.104 ], 00:05:36.104 "product_name": "Malloc disk", 00:05:36.104 "block_size": 512, 00:05:36.104 "num_blocks": 16384, 00:05:36.104 "uuid": "cd9d0a99-563e-4e69-b81a-a1e77f428293", 00:05:36.104 "assigned_rate_limits": { 00:05:36.104 "rw_ios_per_sec": 0, 00:05:36.104 "rw_mbytes_per_sec": 0, 00:05:36.104 "r_mbytes_per_sec": 0, 00:05:36.104 "w_mbytes_per_sec": 0 00:05:36.104 }, 00:05:36.104 "claimed": false, 00:05:36.104 "zoned": false, 00:05:36.104 "supported_io_types": { 00:05:36.104 "read": true, 00:05:36.104 "write": true, 00:05:36.104 "unmap": true, 00:05:36.104 "write_zeroes": true, 00:05:36.104 "flush": true, 00:05:36.104 "reset": true, 00:05:36.104 "compare": false, 00:05:36.104 "compare_and_write": false, 00:05:36.104 "abort": true, 00:05:36.104 "nvme_admin": false, 00:05:36.104 "nvme_io": false 00:05:36.104 }, 00:05:36.104 "memory_domains": [ 00:05:36.104 { 00:05:36.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.104 "dma_device_type": 2 00:05:36.104 } 00:05:36.104 ], 00:05:36.104 "driver_specific": {} 00:05:36.104 } 00:05:36.104 ]' 00:05:36.104 12:05:22 -- rpc/rpc.sh@17 -- # jq length 00:05:36.104 12:05:22 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:36.104 12:05:22 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:36.104 12:05:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.104 12:05:22 -- common/autotest_common.sh@10 -- # set +x 00:05:36.104 [2024-11-02 12:05:22.924396] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:36.104 [2024-11-02 12:05:22.924434] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:36.104 [2024-11-02 12:05:22.924454] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4ee2850 00:05:36.104 [2024-11-02 12:05:22.924464] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:36.104 [2024-11-02 12:05:22.925290] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:36.104 [2024-11-02 12:05:22.925313] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:36.104 Passthru0 00:05:36.104 12:05:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.104 12:05:22 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:36.104 12:05:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.104 12:05:22 -- common/autotest_common.sh@10 -- # set +x 00:05:36.104 12:05:22 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.104 12:05:22 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:36.104 { 00:05:36.104 "name": "Malloc0", 00:05:36.104 "aliases": [ 00:05:36.104 "cd9d0a99-563e-4e69-b81a-a1e77f428293" 00:05:36.104 ], 00:05:36.104 "product_name": "Malloc disk", 00:05:36.104 "block_size": 512, 00:05:36.104 "num_blocks": 16384, 00:05:36.104 "uuid": "cd9d0a99-563e-4e69-b81a-a1e77f428293", 00:05:36.104 "assigned_rate_limits": { 00:05:36.104 "rw_ios_per_sec": 0, 00:05:36.104 
"rw_mbytes_per_sec": 0, 00:05:36.104 "r_mbytes_per_sec": 0, 00:05:36.104 "w_mbytes_per_sec": 0 00:05:36.104 }, 00:05:36.104 "claimed": true, 00:05:36.104 "claim_type": "exclusive_write", 00:05:36.104 "zoned": false, 00:05:36.104 "supported_io_types": { 00:05:36.104 "read": true, 00:05:36.104 "write": true, 00:05:36.104 "unmap": true, 00:05:36.104 "write_zeroes": true, 00:05:36.104 "flush": true, 00:05:36.104 "reset": true, 00:05:36.104 "compare": false, 00:05:36.104 "compare_and_write": false, 00:05:36.104 "abort": true, 00:05:36.104 "nvme_admin": false, 00:05:36.104 "nvme_io": false 00:05:36.104 }, 00:05:36.104 "memory_domains": [ 00:05:36.104 { 00:05:36.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.104 "dma_device_type": 2 00:05:36.104 } 00:05:36.104 ], 00:05:36.104 "driver_specific": {} 00:05:36.104 }, 00:05:36.104 { 00:05:36.104 "name": "Passthru0", 00:05:36.104 "aliases": [ 00:05:36.104 "a7494032-7986-5df5-a240-4422f1031ec4" 00:05:36.104 ], 00:05:36.104 "product_name": "passthru", 00:05:36.104 "block_size": 512, 00:05:36.104 "num_blocks": 16384, 00:05:36.104 "uuid": "a7494032-7986-5df5-a240-4422f1031ec4", 00:05:36.104 "assigned_rate_limits": { 00:05:36.104 "rw_ios_per_sec": 0, 00:05:36.104 "rw_mbytes_per_sec": 0, 00:05:36.104 "r_mbytes_per_sec": 0, 00:05:36.104 "w_mbytes_per_sec": 0 00:05:36.104 }, 00:05:36.104 "claimed": false, 00:05:36.104 "zoned": false, 00:05:36.104 "supported_io_types": { 00:05:36.104 "read": true, 00:05:36.104 "write": true, 00:05:36.104 "unmap": true, 00:05:36.104 "write_zeroes": true, 00:05:36.104 "flush": true, 00:05:36.104 "reset": true, 00:05:36.104 "compare": false, 00:05:36.104 "compare_and_write": false, 00:05:36.104 "abort": true, 00:05:36.104 "nvme_admin": false, 00:05:36.104 "nvme_io": false 00:05:36.104 }, 00:05:36.104 "memory_domains": [ 00:05:36.104 { 00:05:36.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.104 "dma_device_type": 2 00:05:36.104 } 00:05:36.104 ], 00:05:36.104 "driver_specific": { 00:05:36.104 "passthru": { 00:05:36.104 "name": "Passthru0", 00:05:36.104 "base_bdev_name": "Malloc0" 00:05:36.104 } 00:05:36.104 } 00:05:36.104 } 00:05:36.104 ]' 00:05:36.104 12:05:22 -- rpc/rpc.sh@21 -- # jq length 00:05:36.104 12:05:22 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:36.104 12:05:22 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:36.104 12:05:22 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.104 12:05:22 -- common/autotest_common.sh@10 -- # set +x 00:05:36.104 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.104 12:05:23 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:36.104 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.104 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.104 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.104 12:05:23 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:36.104 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.104 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.104 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.104 12:05:23 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:36.104 12:05:23 -- rpc/rpc.sh@26 -- # jq length 00:05:36.104 12:05:23 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:36.104 00:05:36.104 real 0m0.265s 00:05:36.104 user 0m0.159s 00:05:36.104 sys 0m0.047s 00:05:36.104 12:05:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.104 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.104 
************************************ 00:05:36.104 END TEST rpc_integrity 00:05:36.104 ************************************ 00:05:36.362 12:05:23 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:36.362 12:05:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:36.362 12:05:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.362 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.362 ************************************ 00:05:36.362 START TEST rpc_plugins 00:05:36.362 ************************************ 00:05:36.362 12:05:23 -- common/autotest_common.sh@1104 -- # rpc_plugins 00:05:36.362 12:05:23 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:36.362 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.362 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.362 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.362 12:05:23 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:36.363 12:05:23 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:36.363 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.363 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.363 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.363 12:05:23 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:36.363 { 00:05:36.363 "name": "Malloc1", 00:05:36.363 "aliases": [ 00:05:36.363 "065828b9-92b8-41e4-a6f2-09c9d98621af" 00:05:36.363 ], 00:05:36.363 "product_name": "Malloc disk", 00:05:36.363 "block_size": 4096, 00:05:36.363 "num_blocks": 256, 00:05:36.363 "uuid": "065828b9-92b8-41e4-a6f2-09c9d98621af", 00:05:36.363 "assigned_rate_limits": { 00:05:36.363 "rw_ios_per_sec": 0, 00:05:36.363 "rw_mbytes_per_sec": 0, 00:05:36.363 "r_mbytes_per_sec": 0, 00:05:36.363 "w_mbytes_per_sec": 0 00:05:36.363 }, 00:05:36.363 "claimed": false, 00:05:36.363 "zoned": false, 00:05:36.363 "supported_io_types": { 00:05:36.363 "read": true, 00:05:36.363 "write": true, 00:05:36.363 "unmap": true, 00:05:36.363 "write_zeroes": true, 00:05:36.363 "flush": true, 00:05:36.363 "reset": true, 00:05:36.363 "compare": false, 00:05:36.363 "compare_and_write": false, 00:05:36.363 "abort": true, 00:05:36.363 "nvme_admin": false, 00:05:36.363 "nvme_io": false 00:05:36.363 }, 00:05:36.363 "memory_domains": [ 00:05:36.363 { 00:05:36.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.363 "dma_device_type": 2 00:05:36.363 } 00:05:36.363 ], 00:05:36.363 "driver_specific": {} 00:05:36.363 } 00:05:36.363 ]' 00:05:36.363 12:05:23 -- rpc/rpc.sh@32 -- # jq length 00:05:36.363 12:05:23 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:36.363 12:05:23 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:36.363 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.363 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.363 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.363 12:05:23 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:36.363 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.363 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.363 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.363 12:05:23 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:36.363 12:05:23 -- rpc/rpc.sh@36 -- # jq length 00:05:36.363 12:05:23 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:36.363 00:05:36.363 real 0m0.138s 00:05:36.363 user 0m0.085s 00:05:36.363 sys 0m0.025s 00:05:36.363 12:05:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 
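The rpc_integrity and rpc_plugins runs above build and tear down the Malloc/Passthru bdev stacks whose JSON descriptors were just dumped. In raw JSON-RPC terms, the two creation calls behind rpc_integrity map to requests like the following — parameter names as they appear in the dump ("num_blocks", "block_size", "base_bdev_name"), id values arbitrary:

    {"jsonrpc": "2.0", "id": 1, "method": "bdev_malloc_create",
     "params": {"num_blocks": 16384, "block_size": 512}}
    {"jsonrpc": "2.0", "id": 2, "method": "bdev_passthru_create",
     "params": {"base_bdev_name": "Malloc0", "name": "Passthru0"}}

The 16384-block count matches the shell invocation bdev_malloc_create 8 512 seen above: 8 MB divided by the 512-byte block size.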
00:05:36.363 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.363 ************************************ 00:05:36.363 END TEST rpc_plugins 00:05:36.363 ************************************ 00:05:36.363 12:05:23 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:36.363 12:05:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:36.363 12:05:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.363 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.363 ************************************ 00:05:36.363 START TEST rpc_trace_cmd_test 00:05:36.363 ************************************ 00:05:36.363 12:05:23 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:05:36.363 12:05:23 -- rpc/rpc.sh@40 -- # local info 00:05:36.363 12:05:23 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:36.363 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.363 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.363 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.363 12:05:23 -- rpc/rpc.sh@42 -- # info='{ 00:05:36.363 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1121311", 00:05:36.363 "tpoint_group_mask": "0x8", 00:05:36.363 "iscsi_conn": { 00:05:36.363 "mask": "0x2", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 }, 00:05:36.363 "scsi": { 00:05:36.363 "mask": "0x4", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 }, 00:05:36.363 "bdev": { 00:05:36.363 "mask": "0x8", 00:05:36.363 "tpoint_mask": "0xffffffffffffffff" 00:05:36.363 }, 00:05:36.363 "nvmf_rdma": { 00:05:36.363 "mask": "0x10", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 }, 00:05:36.363 "nvmf_tcp": { 00:05:36.363 "mask": "0x20", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 }, 00:05:36.363 "ftl": { 00:05:36.363 "mask": "0x40", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 }, 00:05:36.363 "blobfs": { 00:05:36.363 "mask": "0x80", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 }, 00:05:36.363 "dsa": { 00:05:36.363 "mask": "0x200", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 }, 00:05:36.363 "thread": { 00:05:36.363 "mask": "0x400", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 }, 00:05:36.363 "nvme_pcie": { 00:05:36.363 "mask": "0x800", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 }, 00:05:36.363 "iaa": { 00:05:36.363 "mask": "0x1000", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 }, 00:05:36.363 "nvme_tcp": { 00:05:36.363 "mask": "0x2000", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 }, 00:05:36.363 "bdev_nvme": { 00:05:36.363 "mask": "0x4000", 00:05:36.363 "tpoint_mask": "0x0" 00:05:36.363 } 00:05:36.363 }' 00:05:36.363 12:05:23 -- rpc/rpc.sh@43 -- # jq length 00:05:36.621 12:05:23 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:36.621 12:05:23 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:36.621 12:05:23 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:36.621 12:05:23 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:36.621 12:05:23 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:36.621 12:05:23 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:36.621 12:05:23 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:36.621 12:05:23 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:36.621 12:05:23 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:36.621 00:05:36.621 real 0m0.219s 00:05:36.621 user 0m0.176s 00:05:36.621 sys 0m0.034s 00:05:36.621 12:05:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.621 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.621 
************************************ 00:05:36.621 END TEST rpc_trace_cmd_test 00:05:36.621 ************************************ 00:05:36.621 12:05:23 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:36.621 12:05:23 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:36.621 12:05:23 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:36.621 12:05:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:36.621 12:05:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:36.621 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.621 ************************************ 00:05:36.621 START TEST rpc_daemon_integrity 00:05:36.621 ************************************ 00:05:36.621 12:05:23 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:36.621 12:05:23 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:36.621 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.621 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.621 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.621 12:05:23 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:36.621 12:05:23 -- rpc/rpc.sh@13 -- # jq length 00:05:36.879 12:05:23 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:36.879 12:05:23 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:36.879 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.879 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.879 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.879 12:05:23 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:36.879 12:05:23 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:36.879 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.879 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.879 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.879 12:05:23 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:36.879 { 00:05:36.879 "name": "Malloc2", 00:05:36.879 "aliases": [ 00:05:36.879 "045eaad3-05b2-4ff9-8bc8-0f7c68952c9f" 00:05:36.879 ], 00:05:36.879 "product_name": "Malloc disk", 00:05:36.879 "block_size": 512, 00:05:36.879 "num_blocks": 16384, 00:05:36.879 "uuid": "045eaad3-05b2-4ff9-8bc8-0f7c68952c9f", 00:05:36.879 "assigned_rate_limits": { 00:05:36.879 "rw_ios_per_sec": 0, 00:05:36.879 "rw_mbytes_per_sec": 0, 00:05:36.879 "r_mbytes_per_sec": 0, 00:05:36.879 "w_mbytes_per_sec": 0 00:05:36.879 }, 00:05:36.879 "claimed": false, 00:05:36.879 "zoned": false, 00:05:36.879 "supported_io_types": { 00:05:36.879 "read": true, 00:05:36.879 "write": true, 00:05:36.879 "unmap": true, 00:05:36.879 "write_zeroes": true, 00:05:36.879 "flush": true, 00:05:36.879 "reset": true, 00:05:36.879 "compare": false, 00:05:36.879 "compare_and_write": false, 00:05:36.879 "abort": true, 00:05:36.879 "nvme_admin": false, 00:05:36.879 "nvme_io": false 00:05:36.879 }, 00:05:36.880 "memory_domains": [ 00:05:36.880 { 00:05:36.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.880 "dma_device_type": 2 00:05:36.880 } 00:05:36.880 ], 00:05:36.880 "driver_specific": {} 00:05:36.880 } 00:05:36.880 ]' 00:05:36.880 12:05:23 -- rpc/rpc.sh@17 -- # jq length 00:05:36.880 12:05:23 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:36.880 12:05:23 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:36.880 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.880 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.880 [2024-11-02 12:05:23.682378] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
Malloc2 00:05:36.880 [2024-11-02 12:05:23.682410] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:36.880 [2024-11-02 12:05:23.682425] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4ee44c0 00:05:36.880 [2024-11-02 12:05:23.682435] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:36.880 [2024-11-02 12:05:23.683120] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:36.880 [2024-11-02 12:05:23.683141] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:36.880 Passthru0 00:05:36.880 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.880 12:05:23 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:36.880 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.880 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.880 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.880 12:05:23 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:36.880 { 00:05:36.880 "name": "Malloc2", 00:05:36.880 "aliases": [ 00:05:36.880 "045eaad3-05b2-4ff9-8bc8-0f7c68952c9f" 00:05:36.880 ], 00:05:36.880 "product_name": "Malloc disk", 00:05:36.880 "block_size": 512, 00:05:36.880 "num_blocks": 16384, 00:05:36.880 "uuid": "045eaad3-05b2-4ff9-8bc8-0f7c68952c9f", 00:05:36.880 "assigned_rate_limits": { 00:05:36.880 "rw_ios_per_sec": 0, 00:05:36.880 "rw_mbytes_per_sec": 0, 00:05:36.880 "r_mbytes_per_sec": 0, 00:05:36.880 "w_mbytes_per_sec": 0 00:05:36.880 }, 00:05:36.880 "claimed": true, 00:05:36.880 "claim_type": "exclusive_write", 00:05:36.880 "zoned": false, 00:05:36.880 "supported_io_types": { 00:05:36.880 "read": true, 00:05:36.880 "write": true, 00:05:36.880 "unmap": true, 00:05:36.880 "write_zeroes": true, 00:05:36.880 "flush": true, 00:05:36.880 "reset": true, 00:05:36.880 "compare": false, 00:05:36.880 "compare_and_write": false, 00:05:36.880 "abort": true, 00:05:36.880 "nvme_admin": false, 00:05:36.880 "nvme_io": false 00:05:36.880 }, 00:05:36.880 "memory_domains": [ 00:05:36.880 { 00:05:36.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.880 "dma_device_type": 2 00:05:36.880 } 00:05:36.880 ], 00:05:36.880 "driver_specific": {} 00:05:36.880 }, 00:05:36.880 { 00:05:36.880 "name": "Passthru0", 00:05:36.880 "aliases": [ 00:05:36.880 "a34dc748-f12e-550e-8687-689ffa393f0c" 00:05:36.880 ], 00:05:36.880 "product_name": "passthru", 00:05:36.880 "block_size": 512, 00:05:36.880 "num_blocks": 16384, 00:05:36.880 "uuid": "a34dc748-f12e-550e-8687-689ffa393f0c", 00:05:36.880 "assigned_rate_limits": { 00:05:36.880 "rw_ios_per_sec": 0, 00:05:36.880 "rw_mbytes_per_sec": 0, 00:05:36.880 "r_mbytes_per_sec": 0, 00:05:36.880 "w_mbytes_per_sec": 0 00:05:36.880 }, 00:05:36.880 "claimed": false, 00:05:36.880 "zoned": false, 00:05:36.880 "supported_io_types": { 00:05:36.880 "read": true, 00:05:36.880 "write": true, 00:05:36.880 "unmap": true, 00:05:36.880 "write_zeroes": true, 00:05:36.880 "flush": true, 00:05:36.880 "reset": true, 00:05:36.880 "compare": false, 00:05:36.880 "compare_and_write": false, 00:05:36.880 "abort": true, 00:05:36.880 "nvme_admin": false, 00:05:36.880 "nvme_io": false 00:05:36.880 }, 00:05:36.880 "memory_domains": [ 00:05:36.880 { 00:05:36.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:36.880 "dma_device_type": 2 00:05:36.880 } 00:05:36.880 ], 00:05:36.880 "driver_specific": { 00:05:36.880 "passthru": { 00:05:36.880 "name": "Passthru0", 00:05:36.880 "base_bdev_name": "Malloc2" 00:05:36.880 } 
00:05:36.880 } 00:05:36.880 } 00:05:36.880 ]' 00:05:36.880 12:05:23 -- rpc/rpc.sh@21 -- # jq length 00:05:36.880 12:05:23 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:36.880 12:05:23 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:36.880 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.880 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.880 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.880 12:05:23 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:36.880 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.880 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.880 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.880 12:05:23 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:36.880 12:05:23 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:36.880 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.880 12:05:23 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:36.880 12:05:23 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:36.880 12:05:23 -- rpc/rpc.sh@26 -- # jq length 00:05:36.880 12:05:23 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:36.880 00:05:36.880 real 0m0.265s 00:05:36.880 user 0m0.172s 00:05:36.880 sys 0m0.042s 00:05:36.880 12:05:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:36.880 12:05:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.880 ************************************ 00:05:36.880 END TEST rpc_daemon_integrity 00:05:36.880 ************************************ 00:05:37.139 12:05:23 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:37.139 12:05:23 -- rpc/rpc.sh@84 -- # killprocess 1121311 00:05:37.139 12:05:23 -- common/autotest_common.sh@926 -- # '[' -z 1121311 ']' 00:05:37.139 12:05:23 -- common/autotest_common.sh@930 -- # kill -0 1121311 00:05:37.139 12:05:23 -- common/autotest_common.sh@931 -- # uname 00:05:37.139 12:05:23 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:37.139 12:05:23 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1121311 00:05:37.139 12:05:23 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:37.139 12:05:23 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:37.139 12:05:23 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1121311' 00:05:37.139 killing process with pid 1121311 00:05:37.139 12:05:23 -- common/autotest_common.sh@945 -- # kill 1121311 00:05:37.139 12:05:23 -- common/autotest_common.sh@950 -- # wait 1121311 00:05:37.397 00:05:37.397 real 0m2.362s 00:05:37.397 user 0m2.999s 00:05:37.397 sys 0m0.703s 00:05:37.397 12:05:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:37.397 12:05:24 -- common/autotest_common.sh@10 -- # set +x 00:05:37.397 ************************************ 00:05:37.397 END TEST rpc 00:05:37.397 ************************************ 00:05:37.397 12:05:24 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:37.397 12:05:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:37.397 12:05:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.397 12:05:24 -- common/autotest_common.sh@10 -- # set +x 00:05:37.397 ************************************ 00:05:37.397 START TEST rpc_client 00:05:37.397 ************************************ 00:05:37.397 12:05:24 -- common/autotest_common.sh@1104 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:37.397 * Looking for test storage... 00:05:37.397 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:37.397 12:05:24 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:37.655 OK 00:05:37.655 12:05:24 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:37.655 00:05:37.655 real 0m0.124s 00:05:37.655 user 0m0.047s 00:05:37.655 sys 0m0.088s 00:05:37.655 12:05:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:37.655 12:05:24 -- common/autotest_common.sh@10 -- # set +x 00:05:37.655 ************************************ 00:05:37.655 END TEST rpc_client 00:05:37.655 ************************************ 00:05:37.655 12:05:24 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:37.655 12:05:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:37.655 12:05:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.655 12:05:24 -- common/autotest_common.sh@10 -- # set +x 00:05:37.655 ************************************ 00:05:37.655 START TEST json_config 00:05:37.655 ************************************ 00:05:37.655 12:05:24 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:37.655 12:05:24 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:37.655 12:05:24 -- nvmf/common.sh@7 -- # uname -s 00:05:37.655 12:05:24 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:37.655 12:05:24 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:37.655 12:05:24 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:37.655 12:05:24 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:37.655 12:05:24 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:37.655 12:05:24 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:37.655 12:05:24 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:37.655 12:05:24 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:37.655 12:05:24 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:37.655 12:05:24 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:37.655 12:05:24 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:37.655 12:05:24 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:37.655 12:05:24 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:37.655 12:05:24 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:37.655 12:05:24 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:37.655 12:05:24 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:37.655 12:05:24 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:37.655 12:05:24 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:37.655 12:05:24 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:37.655 12:05:24 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:37.655 12:05:24 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:37.655 12:05:24 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:37.655 12:05:24 -- paths/export.sh@5 -- # export PATH 00:05:37.655 12:05:24 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:37.655 12:05:24 -- nvmf/common.sh@46 -- # : 0 00:05:37.655 12:05:24 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:37.655 12:05:24 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:37.655 12:05:24 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:37.655 12:05:24 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:37.655 12:05:24 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:37.655 12:05:24 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:37.655 12:05:24 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:37.655 12:05:24 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:37.655 12:05:24 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:37.655 12:05:24 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:37.655 12:05:24 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:37.655 12:05:24 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:37.655 12:05:24 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:37.655 WARNING: No tests are enabled so not running JSON configuration tests 00:05:37.655 12:05:24 -- json_config/json_config.sh@27 -- # exit 0 00:05:37.655 00:05:37.655 real 0m0.105s 00:05:37.655 user 0m0.050s 00:05:37.655 sys 0m0.056s 00:05:37.655 12:05:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:37.655 12:05:24 -- common/autotest_common.sh@10 -- # set +x 00:05:37.655 ************************************ 00:05:37.655 END TEST json_config 00:05:37.655 ************************************ 00:05:37.655 12:05:24 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:37.655 12:05:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:37.655 12:05:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:37.655 12:05:24 -- common/autotest_common.sh@10 -- # set +x 00:05:37.655 ************************************ 00:05:37.655 START TEST json_config_extra_key 00:05:37.655 ************************************ 00:05:37.655 12:05:24 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:37.913 12:05:24 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:37.913 12:05:24 -- nvmf/common.sh@7 -- # uname -s 00:05:37.913 12:05:24 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:37.913 12:05:24 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:37.913 12:05:24 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:37.913 12:05:24 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:37.913 12:05:24 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:37.913 12:05:24 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:37.913 12:05:24 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:37.913 12:05:24 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:37.913 12:05:24 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:37.913 12:05:24 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:37.913 12:05:24 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:37.913 12:05:24 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:37.913 12:05:24 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:37.913 12:05:24 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:37.913 12:05:24 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:37.913 12:05:24 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:37.913 12:05:24 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:37.913 12:05:24 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:37.913 12:05:24 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:37.913 12:05:24 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:37.913 12:05:24 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:37.913 12:05:24 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:37.913 12:05:24 -- paths/export.sh@5 -- # export PATH 00:05:37.914 12:05:24 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:37.914 12:05:24 -- nvmf/common.sh@46 -- # : 0 00:05:37.914 12:05:24 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:37.914 12:05:24 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:37.914 12:05:24 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:37.914 12:05:24 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:37.914 12:05:24 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:37.914 12:05:24 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:37.914 12:05:24 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:37.914 12:05:24 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:37.914 INFO: launching applications... 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=1122056 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:37.914 Waiting for target to run... 
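The json_config_extra_key harness traced above tracks each target app through parallel associative arrays (app_pid, app_socket, app_params, configs_path), all keyed by a logical app name. A minimal sketch of that bookkeeping pattern follows; the relative binary and config paths are placeholders, not the paths from this run:

  #!/usr/bin/env bash
  # Per-app bookkeeping keyed by a logical app name ("target" here),
  # mirroring the pattern used by json_config_extra_key.sh.
  declare -A app_pid=(['target']='')
  declare -A app_socket=(['target']='/var/tmp/spdk_tgt.sock')
  declare -A app_params=(['target']='-m 0x1 -s 1024')
  declare -A configs_path=(['target']='./extra_key.json')   # placeholder config path

  start_app() {
      local app=$1
      ./build/bin/spdk_tgt ${app_params[$app]} \
          -r "${app_socket[$app]}" --json "${configs_path[$app]}" &
      app_pid[$app]=$!
  }

  start_app target
  echo "target pid: ${app_pid[target]}, socket: ${app_socket[target]}"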
00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 1122056 /var/tmp/spdk_tgt.sock 00:05:37.914 12:05:24 -- common/autotest_common.sh@819 -- # '[' -z 1122056 ']' 00:05:37.914 12:05:24 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:37.914 12:05:24 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:37.914 12:05:24 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:37.914 12:05:24 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:37.914 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:37.914 12:05:24 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:37.914 12:05:24 -- common/autotest_common.sh@10 -- # set +x 00:05:37.914 [2024-11-02 12:05:24.721773] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:05:37.914 [2024-11-02 12:05:24.721862] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1122056 ] 00:05:37.914 EAL: No free 2048 kB hugepages reported on node 1 00:05:38.478 [2024-11-02 12:05:25.159507] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:38.478 [2024-11-02 12:05:25.186779] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:38.478 [2024-11-02 12:05:25.186890] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.736 12:05:25 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:38.736 12:05:25 -- common/autotest_common.sh@852 -- # return 0 00:05:38.736 12:05:25 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:38.736 00:05:38.736 12:05:25 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:38.736 INFO: shutting down applications... 
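The shutdown that follows uses the harness's poll-loop idiom: send SIGINT once, then probe the pid with kill -0 for up to 30 half-second intervals before giving up. A hedged standalone sketch of that loop (the 30-iteration / 0.5 s budget is taken from the trace above; the function name is illustrative):

  # Gracefully stop an spdk_tgt instance and wait for it to exit,
  # following the kill -SIGINT / kill -0 poll loop seen in the trace.
  shutdown_app() {
      local pid=$1
      kill -SIGINT "$pid" 2>/dev/null || return 0   # already gone
      for ((i = 0; i < 30; i++)); do
          if ! kill -0 "$pid" 2>/dev/null; then
              echo 'SPDK target shutdown done'
              return 0
          fi
          sleep 0.5
      done
      echo "target $pid did not exit in time" >&2
      return 1
  }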
00:05:38.736 12:05:25 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:38.736 12:05:25 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:38.736 12:05:25 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:38.736 12:05:25 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 1122056 ]] 00:05:38.736 12:05:25 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 1122056 00:05:38.736 12:05:25 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:38.736 12:05:25 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:38.736 12:05:25 -- json_config/json_config_extra_key.sh@50 -- # kill -0 1122056 00:05:38.736 12:05:25 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:39.301 12:05:26 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:39.301 12:05:26 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:39.301 12:05:26 -- json_config/json_config_extra_key.sh@50 -- # kill -0 1122056 00:05:39.301 12:05:26 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:39.301 12:05:26 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:39.302 12:05:26 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:39.302 12:05:26 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:39.302 SPDK target shutdown done 00:05:39.302 12:05:26 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:39.302 Success 00:05:39.302 00:05:39.302 real 0m1.461s 00:05:39.302 user 0m1.044s 00:05:39.302 sys 0m0.554s 00:05:39.302 12:05:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:39.302 12:05:26 -- common/autotest_common.sh@10 -- # set +x 00:05:39.302 ************************************ 00:05:39.302 END TEST json_config_extra_key 00:05:39.302 ************************************ 00:05:39.302 12:05:26 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:39.302 12:05:26 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:39.302 12:05:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:39.302 12:05:26 -- common/autotest_common.sh@10 -- # set +x 00:05:39.302 ************************************ 00:05:39.302 START TEST alias_rpc 00:05:39.302 ************************************ 00:05:39.302 12:05:26 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:39.302 * Looking for test storage... 00:05:39.302 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:39.302 12:05:26 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:39.302 12:05:26 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:39.302 12:05:26 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1122348 00:05:39.302 12:05:26 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1122348 00:05:39.302 12:05:26 -- common/autotest_common.sh@819 -- # '[' -z 1122348 ']' 00:05:39.302 12:05:26 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:39.302 12:05:26 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:39.302 12:05:26 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:05:39.302 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:39.302 12:05:26 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:39.302 12:05:26 -- common/autotest_common.sh@10 -- # set +x 00:05:39.302 [2024-11-02 12:05:26.205608] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:05:39.302 [2024-11-02 12:05:26.205674] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1122348 ] 00:05:39.302 EAL: No free 2048 kB hugepages reported on node 1 00:05:39.559 [2024-11-02 12:05:26.281790] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:39.559 [2024-11-02 12:05:26.334209] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:39.559 [2024-11-02 12:05:26.334390] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:40.493 12:05:27 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:40.493 12:05:27 -- common/autotest_common.sh@852 -- # return 0 00:05:40.493 12:05:27 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:40.493 12:05:27 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1122348 00:05:40.493 12:05:27 -- common/autotest_common.sh@926 -- # '[' -z 1122348 ']' 00:05:40.493 12:05:27 -- common/autotest_common.sh@930 -- # kill -0 1122348 00:05:40.493 12:05:27 -- common/autotest_common.sh@931 -- # uname 00:05:40.493 12:05:27 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:40.493 12:05:27 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1122348 00:05:40.493 12:05:27 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:40.493 12:05:27 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:40.493 12:05:27 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1122348' 00:05:40.493 killing process with pid 1122348 00:05:40.493 12:05:27 -- common/autotest_common.sh@945 -- # kill 1122348 00:05:40.493 12:05:27 -- common/autotest_common.sh@950 -- # wait 1122348 00:05:40.751 00:05:40.751 real 0m1.603s 00:05:40.751 user 0m1.835s 00:05:40.751 sys 0m0.444s 00:05:40.751 12:05:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:40.751 12:05:27 -- common/autotest_common.sh@10 -- # set +x 00:05:40.751 ************************************ 00:05:40.751 END TEST alias_rpc 00:05:40.751 ************************************ 00:05:41.010 12:05:27 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:05:41.010 12:05:27 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:41.010 12:05:27 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:41.010 12:05:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:41.010 12:05:27 -- common/autotest_common.sh@10 -- # set +x 00:05:41.010 ************************************ 00:05:41.010 START TEST spdkcli_tcp 00:05:41.010 ************************************ 00:05:41.010 12:05:27 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:41.010 * Looking for test storage... 
00:05:41.010 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:41.010 12:05:27 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:41.010 12:05:27 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:41.010 12:05:27 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:41.010 12:05:27 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:41.010 12:05:27 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:41.010 12:05:27 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:41.010 12:05:27 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:41.010 12:05:27 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:41.010 12:05:27 -- common/autotest_common.sh@10 -- # set +x 00:05:41.010 12:05:27 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1122728 00:05:41.010 12:05:27 -- spdkcli/tcp.sh@27 -- # waitforlisten 1122728 00:05:41.010 12:05:27 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:41.010 12:05:27 -- common/autotest_common.sh@819 -- # '[' -z 1122728 ']' 00:05:41.010 12:05:27 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:41.010 12:05:27 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:41.010 12:05:27 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:41.010 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:41.010 12:05:27 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:41.010 12:05:27 -- common/autotest_common.sh@10 -- # set +x 00:05:41.010 [2024-11-02 12:05:27.887764] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:05:41.010 [2024-11-02 12:05:27.887851] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1122728 ] 00:05:41.010 EAL: No free 2048 kB hugepages reported on node 1 00:05:41.010 [2024-11-02 12:05:27.954606] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:41.269 [2024-11-02 12:05:27.992443] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:41.269 [2024-11-02 12:05:27.992593] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:41.269 [2024-11-02 12:05:27.992596] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.835 12:05:28 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:41.835 12:05:28 -- common/autotest_common.sh@852 -- # return 0 00:05:41.835 12:05:28 -- spdkcli/tcp.sh@31 -- # socat_pid=1122749 00:05:41.835 12:05:28 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:41.835 12:05:28 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:42.094 [ 00:05:42.094 "spdk_get_version", 00:05:42.094 "rpc_get_methods", 00:05:42.094 "trace_get_info", 00:05:42.094 "trace_get_tpoint_group_mask", 00:05:42.094 "trace_disable_tpoint_group", 00:05:42.094 "trace_enable_tpoint_group", 00:05:42.094 "trace_clear_tpoint_mask", 00:05:42.094 "trace_set_tpoint_mask", 00:05:42.094 "vfu_tgt_set_base_path", 00:05:42.094 "framework_get_pci_devices", 00:05:42.094 "framework_get_config", 00:05:42.094 "framework_get_subsystems", 00:05:42.094 "iobuf_get_stats", 00:05:42.094 "iobuf_set_options", 00:05:42.094 "sock_set_default_impl", 00:05:42.094 "sock_impl_set_options", 00:05:42.094 "sock_impl_get_options", 00:05:42.094 "vmd_rescan", 00:05:42.094 "vmd_remove_device", 00:05:42.094 "vmd_enable", 00:05:42.094 "accel_get_stats", 00:05:42.094 "accel_set_options", 00:05:42.094 "accel_set_driver", 00:05:42.094 "accel_crypto_key_destroy", 00:05:42.094 "accel_crypto_keys_get", 00:05:42.094 "accel_crypto_key_create", 00:05:42.094 "accel_assign_opc", 00:05:42.094 "accel_get_module_info", 00:05:42.094 "accel_get_opc_assignments", 00:05:42.094 "notify_get_notifications", 00:05:42.094 "notify_get_types", 00:05:42.094 "bdev_get_histogram", 00:05:42.094 "bdev_enable_histogram", 00:05:42.094 "bdev_set_qos_limit", 00:05:42.094 "bdev_set_qd_sampling_period", 00:05:42.094 "bdev_get_bdevs", 00:05:42.094 "bdev_reset_iostat", 00:05:42.094 "bdev_get_iostat", 00:05:42.094 "bdev_examine", 00:05:42.094 "bdev_wait_for_examine", 00:05:42.094 "bdev_set_options", 00:05:42.094 "scsi_get_devices", 00:05:42.094 "thread_set_cpumask", 00:05:42.094 "framework_get_scheduler", 00:05:42.094 "framework_set_scheduler", 00:05:42.094 "framework_get_reactors", 00:05:42.094 "thread_get_io_channels", 00:05:42.094 "thread_get_pollers", 00:05:42.094 "thread_get_stats", 00:05:42.094 "framework_monitor_context_switch", 00:05:42.094 "spdk_kill_instance", 00:05:42.094 "log_enable_timestamps", 00:05:42.094 "log_get_flags", 00:05:42.094 "log_clear_flag", 00:05:42.094 "log_set_flag", 00:05:42.094 "log_get_level", 00:05:42.094 "log_set_level", 00:05:42.094 "log_get_print_level", 00:05:42.094 "log_set_print_level", 00:05:42.094 "framework_enable_cpumask_locks", 00:05:42.094 "framework_disable_cpumask_locks", 00:05:42.094 "framework_wait_init", 00:05:42.094 
"framework_start_init", 00:05:42.094 "virtio_blk_create_transport", 00:05:42.094 "virtio_blk_get_transports", 00:05:42.094 "vhost_controller_set_coalescing", 00:05:42.094 "vhost_get_controllers", 00:05:42.094 "vhost_delete_controller", 00:05:42.094 "vhost_create_blk_controller", 00:05:42.094 "vhost_scsi_controller_remove_target", 00:05:42.094 "vhost_scsi_controller_add_target", 00:05:42.094 "vhost_start_scsi_controller", 00:05:42.094 "vhost_create_scsi_controller", 00:05:42.094 "ublk_recover_disk", 00:05:42.094 "ublk_get_disks", 00:05:42.094 "ublk_stop_disk", 00:05:42.094 "ublk_start_disk", 00:05:42.094 "ublk_destroy_target", 00:05:42.094 "ublk_create_target", 00:05:42.094 "nbd_get_disks", 00:05:42.094 "nbd_stop_disk", 00:05:42.094 "nbd_start_disk", 00:05:42.094 "env_dpdk_get_mem_stats", 00:05:42.094 "nvmf_subsystem_get_listeners", 00:05:42.094 "nvmf_subsystem_get_qpairs", 00:05:42.094 "nvmf_subsystem_get_controllers", 00:05:42.094 "nvmf_get_stats", 00:05:42.094 "nvmf_get_transports", 00:05:42.094 "nvmf_create_transport", 00:05:42.094 "nvmf_get_targets", 00:05:42.094 "nvmf_delete_target", 00:05:42.094 "nvmf_create_target", 00:05:42.094 "nvmf_subsystem_allow_any_host", 00:05:42.094 "nvmf_subsystem_remove_host", 00:05:42.094 "nvmf_subsystem_add_host", 00:05:42.094 "nvmf_subsystem_remove_ns", 00:05:42.094 "nvmf_subsystem_add_ns", 00:05:42.094 "nvmf_subsystem_listener_set_ana_state", 00:05:42.094 "nvmf_discovery_get_referrals", 00:05:42.094 "nvmf_discovery_remove_referral", 00:05:42.094 "nvmf_discovery_add_referral", 00:05:42.094 "nvmf_subsystem_remove_listener", 00:05:42.094 "nvmf_subsystem_add_listener", 00:05:42.094 "nvmf_delete_subsystem", 00:05:42.094 "nvmf_create_subsystem", 00:05:42.094 "nvmf_get_subsystems", 00:05:42.094 "nvmf_set_crdt", 00:05:42.094 "nvmf_set_config", 00:05:42.094 "nvmf_set_max_subsystems", 00:05:42.094 "iscsi_set_options", 00:05:42.094 "iscsi_get_auth_groups", 00:05:42.094 "iscsi_auth_group_remove_secret", 00:05:42.094 "iscsi_auth_group_add_secret", 00:05:42.094 "iscsi_delete_auth_group", 00:05:42.094 "iscsi_create_auth_group", 00:05:42.094 "iscsi_set_discovery_auth", 00:05:42.094 "iscsi_get_options", 00:05:42.094 "iscsi_target_node_request_logout", 00:05:42.094 "iscsi_target_node_set_redirect", 00:05:42.094 "iscsi_target_node_set_auth", 00:05:42.094 "iscsi_target_node_add_lun", 00:05:42.094 "iscsi_get_connections", 00:05:42.094 "iscsi_portal_group_set_auth", 00:05:42.094 "iscsi_start_portal_group", 00:05:42.094 "iscsi_delete_portal_group", 00:05:42.094 "iscsi_create_portal_group", 00:05:42.094 "iscsi_get_portal_groups", 00:05:42.094 "iscsi_delete_target_node", 00:05:42.094 "iscsi_target_node_remove_pg_ig_maps", 00:05:42.094 "iscsi_target_node_add_pg_ig_maps", 00:05:42.094 "iscsi_create_target_node", 00:05:42.094 "iscsi_get_target_nodes", 00:05:42.094 "iscsi_delete_initiator_group", 00:05:42.094 "iscsi_initiator_group_remove_initiators", 00:05:42.094 "iscsi_initiator_group_add_initiators", 00:05:42.094 "iscsi_create_initiator_group", 00:05:42.094 "iscsi_get_initiator_groups", 00:05:42.094 "vfu_virtio_create_scsi_endpoint", 00:05:42.094 "vfu_virtio_scsi_remove_target", 00:05:42.094 "vfu_virtio_scsi_add_target", 00:05:42.094 "vfu_virtio_create_blk_endpoint", 00:05:42.094 "vfu_virtio_delete_endpoint", 00:05:42.094 "iaa_scan_accel_module", 00:05:42.094 "dsa_scan_accel_module", 00:05:42.094 "ioat_scan_accel_module", 00:05:42.094 "accel_error_inject_error", 00:05:42.094 "bdev_iscsi_delete", 00:05:42.094 "bdev_iscsi_create", 00:05:42.094 "bdev_iscsi_set_options", 
00:05:42.094 "bdev_virtio_attach_controller", 00:05:42.094 "bdev_virtio_scsi_get_devices", 00:05:42.094 "bdev_virtio_detach_controller", 00:05:42.094 "bdev_virtio_blk_set_hotplug", 00:05:42.094 "bdev_ftl_set_property", 00:05:42.094 "bdev_ftl_get_properties", 00:05:42.094 "bdev_ftl_get_stats", 00:05:42.094 "bdev_ftl_unmap", 00:05:42.094 "bdev_ftl_unload", 00:05:42.094 "bdev_ftl_delete", 00:05:42.094 "bdev_ftl_load", 00:05:42.094 "bdev_ftl_create", 00:05:42.094 "bdev_aio_delete", 00:05:42.094 "bdev_aio_rescan", 00:05:42.094 "bdev_aio_create", 00:05:42.094 "blobfs_create", 00:05:42.094 "blobfs_detect", 00:05:42.094 "blobfs_set_cache_size", 00:05:42.094 "bdev_zone_block_delete", 00:05:42.095 "bdev_zone_block_create", 00:05:42.095 "bdev_delay_delete", 00:05:42.095 "bdev_delay_create", 00:05:42.095 "bdev_delay_update_latency", 00:05:42.095 "bdev_split_delete", 00:05:42.095 "bdev_split_create", 00:05:42.095 "bdev_error_inject_error", 00:05:42.095 "bdev_error_delete", 00:05:42.095 "bdev_error_create", 00:05:42.095 "bdev_raid_set_options", 00:05:42.095 "bdev_raid_remove_base_bdev", 00:05:42.095 "bdev_raid_add_base_bdev", 00:05:42.095 "bdev_raid_delete", 00:05:42.095 "bdev_raid_create", 00:05:42.095 "bdev_raid_get_bdevs", 00:05:42.095 "bdev_lvol_grow_lvstore", 00:05:42.095 "bdev_lvol_get_lvols", 00:05:42.095 "bdev_lvol_get_lvstores", 00:05:42.095 "bdev_lvol_delete", 00:05:42.095 "bdev_lvol_set_read_only", 00:05:42.095 "bdev_lvol_resize", 00:05:42.095 "bdev_lvol_decouple_parent", 00:05:42.095 "bdev_lvol_inflate", 00:05:42.095 "bdev_lvol_rename", 00:05:42.095 "bdev_lvol_clone_bdev", 00:05:42.095 "bdev_lvol_clone", 00:05:42.095 "bdev_lvol_snapshot", 00:05:42.095 "bdev_lvol_create", 00:05:42.095 "bdev_lvol_delete_lvstore", 00:05:42.095 "bdev_lvol_rename_lvstore", 00:05:42.095 "bdev_lvol_create_lvstore", 00:05:42.095 "bdev_passthru_delete", 00:05:42.095 "bdev_passthru_create", 00:05:42.095 "bdev_nvme_cuse_unregister", 00:05:42.095 "bdev_nvme_cuse_register", 00:05:42.095 "bdev_opal_new_user", 00:05:42.095 "bdev_opal_set_lock_state", 00:05:42.095 "bdev_opal_delete", 00:05:42.095 "bdev_opal_get_info", 00:05:42.095 "bdev_opal_create", 00:05:42.095 "bdev_nvme_opal_revert", 00:05:42.095 "bdev_nvme_opal_init", 00:05:42.095 "bdev_nvme_send_cmd", 00:05:42.095 "bdev_nvme_get_path_iostat", 00:05:42.095 "bdev_nvme_get_mdns_discovery_info", 00:05:42.095 "bdev_nvme_stop_mdns_discovery", 00:05:42.095 "bdev_nvme_start_mdns_discovery", 00:05:42.095 "bdev_nvme_set_multipath_policy", 00:05:42.095 "bdev_nvme_set_preferred_path", 00:05:42.095 "bdev_nvme_get_io_paths", 00:05:42.095 "bdev_nvme_remove_error_injection", 00:05:42.095 "bdev_nvme_add_error_injection", 00:05:42.095 "bdev_nvme_get_discovery_info", 00:05:42.095 "bdev_nvme_stop_discovery", 00:05:42.095 "bdev_nvme_start_discovery", 00:05:42.095 "bdev_nvme_get_controller_health_info", 00:05:42.095 "bdev_nvme_disable_controller", 00:05:42.095 "bdev_nvme_enable_controller", 00:05:42.095 "bdev_nvme_reset_controller", 00:05:42.095 "bdev_nvme_get_transport_statistics", 00:05:42.095 "bdev_nvme_apply_firmware", 00:05:42.095 "bdev_nvme_detach_controller", 00:05:42.095 "bdev_nvme_get_controllers", 00:05:42.095 "bdev_nvme_attach_controller", 00:05:42.095 "bdev_nvme_set_hotplug", 00:05:42.095 "bdev_nvme_set_options", 00:05:42.095 "bdev_null_resize", 00:05:42.095 "bdev_null_delete", 00:05:42.095 "bdev_null_create", 00:05:42.095 "bdev_malloc_delete", 00:05:42.095 "bdev_malloc_create" 00:05:42.095 ] 00:05:42.095 12:05:28 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:42.095 12:05:28 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:42.095 12:05:28 -- common/autotest_common.sh@10 -- # set +x 00:05:42.095 12:05:28 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:42.095 12:05:28 -- spdkcli/tcp.sh@38 -- # killprocess 1122728 00:05:42.095 12:05:28 -- common/autotest_common.sh@926 -- # '[' -z 1122728 ']' 00:05:42.095 12:05:28 -- common/autotest_common.sh@930 -- # kill -0 1122728 00:05:42.095 12:05:28 -- common/autotest_common.sh@931 -- # uname 00:05:42.095 12:05:28 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:42.095 12:05:28 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1122728 00:05:42.095 12:05:28 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:42.095 12:05:28 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:42.095 12:05:28 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1122728' 00:05:42.095 killing process with pid 1122728 00:05:42.095 12:05:28 -- common/autotest_common.sh@945 -- # kill 1122728 00:05:42.095 12:05:29 -- common/autotest_common.sh@950 -- # wait 1122728 00:05:42.353 00:05:42.353 real 0m1.536s 00:05:42.353 user 0m2.906s 00:05:42.353 sys 0m0.489s 00:05:42.353 12:05:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:42.353 12:05:29 -- common/autotest_common.sh@10 -- # set +x 00:05:42.353 ************************************ 00:05:42.353 END TEST spdkcli_tcp 00:05:42.353 ************************************ 00:05:42.611 12:05:29 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:42.611 12:05:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:42.611 12:05:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:42.611 12:05:29 -- common/autotest_common.sh@10 -- # set +x 00:05:42.611 ************************************ 00:05:42.611 START TEST dpdk_mem_utility 00:05:42.611 ************************************ 00:05:42.611 12:05:29 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:42.611 * Looking for test storage... 00:05:42.611 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:42.611 12:05:29 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:42.611 12:05:29 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1123061 00:05:42.611 12:05:29 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1123061 00:05:42.611 12:05:29 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.611 12:05:29 -- common/autotest_common.sh@819 -- # '[' -z 1123061 ']' 00:05:42.611 12:05:29 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:42.611 12:05:29 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:42.611 12:05:29 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:42.611 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
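The killprocess helper traced above guards the final kill: it verifies the pid is still alive, reads the command name with ps so it never signals a sudo wrapper by mistake, then kills and reaps the reactor process. A condensed, simplified sketch of that guard (the real helper in autotest_common.sh handles more cases):

  # Kill an SPDK reactor process by pid, refusing to signal anything
  # that does not look like the target (e.g. a sudo wrapper).
  killprocess() {
      local pid=$1
      kill -0 "$pid" 2>/dev/null || return 0          # nothing to do
      local process_name
      process_name=$(ps --no-headers -o comm= "$pid")
      if [ "$process_name" = "sudo" ]; then
          return 1                                    # never kill the wrapper
      fi
      echo "killing process with pid $pid"
      kill "$pid"
      wait "$pid" 2>/dev/null
  }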
00:05:42.611 12:05:29 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:42.611 12:05:29 -- common/autotest_common.sh@10 -- # set +x 00:05:42.611 [2024-11-02 12:05:29.458753] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:05:42.611 [2024-11-02 12:05:29.458858] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1123061 ] 00:05:42.611 EAL: No free 2048 kB hugepages reported on node 1 00:05:42.611 [2024-11-02 12:05:29.524868] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:42.611 [2024-11-02 12:05:29.561217] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:42.611 [2024-11-02 12:05:29.561347] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:43.547 12:05:30 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:43.547 12:05:30 -- common/autotest_common.sh@852 -- # return 0 00:05:43.547 12:05:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:43.547 12:05:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:43.547 12:05:30 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:43.547 12:05:30 -- common/autotest_common.sh@10 -- # set +x 00:05:43.547 { 00:05:43.547 "filename": "/tmp/spdk_mem_dump.txt" 00:05:43.547 } 00:05:43.547 12:05:30 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:43.547 12:05:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:43.547 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:43.547 1 heaps totaling size 814.000000 MiB 00:05:43.547 size: 814.000000 MiB heap id: 0 00:05:43.547 end heaps---------- 00:05:43.547 8 mempools totaling size 598.116089 MiB 00:05:43.547 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:43.547 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:43.547 size: 84.521057 MiB name: bdev_io_1123061 00:05:43.547 size: 51.011292 MiB name: evtpool_1123061 00:05:43.547 size: 50.003479 MiB name: msgpool_1123061 00:05:43.547 size: 21.763794 MiB name: PDU_Pool 00:05:43.547 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:43.547 size: 0.026123 MiB name: Session_Pool 00:05:43.547 end mempools------- 00:05:43.547 6 memzones totaling size 4.142822 MiB 00:05:43.547 size: 1.000366 MiB name: RG_ring_0_1123061 00:05:43.547 size: 1.000366 MiB name: RG_ring_1_1123061 00:05:43.547 size: 1.000366 MiB name: RG_ring_4_1123061 00:05:43.547 size: 1.000366 MiB name: RG_ring_5_1123061 00:05:43.547 size: 0.125366 MiB name: RG_ring_2_1123061 00:05:43.547 size: 0.015991 MiB name: RG_ring_3_1123061 00:05:43.547 end memzones------- 00:05:43.547 12:05:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:43.547 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:43.547 list of free elements. 
size: 12.519348 MiB 00:05:43.547 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:43.547 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:43.547 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:43.547 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:43.547 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:43.547 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:43.547 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:43.547 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:43.547 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:43.547 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:43.547 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:43.547 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:43.547 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:43.547 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:43.547 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:43.547 list of standard malloc elements. size: 199.218079 MiB 00:05:43.547 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:43.547 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:43.547 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:43.547 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:43.547 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:43.547 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:43.547 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:43.547 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:43.547 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:43.547 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:43.547 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:43.547 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:43.547 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:43.547 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:43.547 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:43.547 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:43.547 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:43.547 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:43.547 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:43.547 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:43.547 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:43.547 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:43.547 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:43.547 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:43.547 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:43.547 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:43.547 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:43.547 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:43.547 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:43.547 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:43.547 list of memzone associated elements. size: 602.262573 MiB 00:05:43.547 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:43.547 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:43.547 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:43.547 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:43.547 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:43.547 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1123061_0 00:05:43.547 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:43.547 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1123061_0 00:05:43.547 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:43.547 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1123061_0 00:05:43.547 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:43.547 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:43.547 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:43.547 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:43.547 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:43.547 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1123061 00:05:43.547 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:43.547 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1123061 00:05:43.547 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:43.547 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1123061 00:05:43.547 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:43.547 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:43.547 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:43.547 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:43.547 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:43.547 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:43.547 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:43.547 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:43.548 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:43.548 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1123061 00:05:43.548 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:43.548 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1123061 00:05:43.548 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:43.548 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1123061 00:05:43.548 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:43.548 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1123061 00:05:43.548 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:43.548 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1123061 00:05:43.548 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:43.548 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:43.548 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:43.548 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:43.548 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:43.548 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:43.548 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:43.548 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1123061 00:05:43.548 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:43.548 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:43.548 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:43.548 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:43.548 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:43.548 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1123061 00:05:43.548 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:43.548 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:43.548 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:43.548 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1123061 00:05:43.548 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:43.548 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1123061 00:05:43.548 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:43.548 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:43.548 12:05:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:43.548 12:05:30 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1123061 00:05:43.548 12:05:30 -- common/autotest_common.sh@926 -- # '[' -z 1123061 ']' 00:05:43.548 12:05:30 -- common/autotest_common.sh@930 -- # kill -0 1123061 00:05:43.548 12:05:30 -- common/autotest_common.sh@931 -- # uname 00:05:43.548 12:05:30 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:43.548 12:05:30 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1123061 00:05:43.548 12:05:30 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:43.548 12:05:30 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:43.548 12:05:30 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1123061' 00:05:43.548 killing process with pid 1123061 00:05:43.548 12:05:30 -- common/autotest_common.sh@945 -- # kill 1123061 00:05:43.548 12:05:30 -- common/autotest_common.sh@950 -- # wait 1123061 00:05:43.807 00:05:43.807 real 0m1.398s 00:05:43.807 user 0m1.462s 00:05:43.807 sys 0m0.421s 00:05:43.807 12:05:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:43.807 12:05:30 -- common/autotest_common.sh@10 -- # set +x 00:05:43.807 ************************************ 00:05:43.807 END TEST dpdk_mem_utility 00:05:43.807 ************************************ 00:05:43.807 12:05:30 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:43.807 12:05:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:43.807 12:05:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:43.807 12:05:30 -- common/autotest_common.sh@10 -- # set +x 
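The dump above comes from two invocations of scripts/dpdk_mem_info.py: a bare run for the heap, mempool, and memzone summary, then -m 0 for the element-level view of heap id 0. Reproducing it against a running target looks roughly like the following (per the trace, the RPC writes its dump to /tmp/spdk_mem_dump.txt, which the script then renders):

  # Ask a running SPDK target to dump its DPDK memory state, then render it.
  ./scripts/rpc.py env_dpdk_get_mem_stats          # writes /tmp/spdk_mem_dump.txt
  ./scripts/dpdk_mem_info.py                       # heap / mempool / memzone summary
  ./scripts/dpdk_mem_info.py -m 0                  # per-element detail for heap id 0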
00:05:43.807 ************************************ 00:05:43.807 START TEST event 00:05:43.807 ************************************ 00:05:43.807 12:05:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:44.066 * Looking for test storage... 00:05:44.066 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:44.066 12:05:30 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:44.066 12:05:30 -- bdev/nbd_common.sh@6 -- # set -e 00:05:44.066 12:05:30 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:44.066 12:05:30 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:05:44.066 12:05:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:44.066 12:05:30 -- common/autotest_common.sh@10 -- # set +x 00:05:44.066 ************************************ 00:05:44.066 START TEST event_perf 00:05:44.066 ************************************ 00:05:44.066 12:05:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:44.066 Running I/O for 1 seconds...[2024-11-02 12:05:30.904195] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:05:44.066 [2024-11-02 12:05:30.904334] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1123357 ] 00:05:44.066 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.066 [2024-11-02 12:05:30.974010] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:44.066 [2024-11-02 12:05:31.013634] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:44.066 [2024-11-02 12:05:31.013728] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:44.066 [2024-11-02 12:05:31.013987] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:44.066 [2024-11-02 12:05:31.013989] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:45.445 Running I/O for 1 seconds... 00:05:45.445 lcore 0: 195383 00:05:45.445 lcore 1: 195379 00:05:45.445 lcore 2: 195381 00:05:45.445 lcore 3: 195381 00:05:45.445 done. 
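event_perf above is a standalone binary: -m picks the reactor core mask, -t the run time in seconds, and each lcore reports how many events it processed (here roughly 195k events per core in one second). The remaining event tests in this section are driven the same way; a sketch of the three invocations as they appear in the trace, assuming the binaries are run from the repo root:

  # Drive the three event micro-benchmarks the way this section does.
  ./test/event/event_perf/event_perf -m 0xF -t 1    # 4 reactors, 1 second
  ./test/event/reactor/reactor -t 1                 # single-reactor tick test
  ./test/event/reactor_perf/reactor_perf -t 1       # events/second on one reactor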
00:05:45.445 00:05:45.445 real 0m1.184s 00:05:45.445 user 0m4.088s 00:05:45.445 sys 0m0.094s 00:05:45.445 12:05:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:45.445 12:05:32 -- common/autotest_common.sh@10 -- # set +x 00:05:45.445 ************************************ 00:05:45.445 END TEST event_perf 00:05:45.445 ************************************ 00:05:45.445 12:05:32 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:45.445 12:05:32 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:45.445 12:05:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:45.445 12:05:32 -- common/autotest_common.sh@10 -- # set +x 00:05:45.445 ************************************ 00:05:45.445 START TEST event_reactor 00:05:45.445 ************************************ 00:05:45.445 12:05:32 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:45.445 [2024-11-02 12:05:32.138816] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:05:45.445 [2024-11-02 12:05:32.138913] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1123488 ] 00:05:45.445 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.445 [2024-11-02 12:05:32.206586] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.445 [2024-11-02 12:05:32.242226] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.381 test_start 00:05:46.381 oneshot 00:05:46.381 tick 100 00:05:46.381 tick 100 00:05:46.381 tick 250 00:05:46.381 tick 100 00:05:46.381 tick 100 00:05:46.381 tick 100 00:05:46.381 tick 250 00:05:46.381 tick 500 00:05:46.381 tick 100 00:05:46.381 tick 100 00:05:46.381 tick 250 00:05:46.381 tick 100 00:05:46.381 tick 100 00:05:46.381 test_end 00:05:46.381 00:05:46.381 real 0m1.174s 00:05:46.381 user 0m1.086s 00:05:46.381 sys 0m0.083s 00:05:46.381 12:05:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:46.381 12:05:33 -- common/autotest_common.sh@10 -- # set +x 00:05:46.381 ************************************ 00:05:46.381 END TEST event_reactor 00:05:46.381 ************************************ 00:05:46.381 12:05:33 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:46.381 12:05:33 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:46.381 12:05:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:46.381 12:05:33 -- common/autotest_common.sh@10 -- # set +x 00:05:46.381 ************************************ 00:05:46.381 START TEST event_reactor_perf 00:05:46.381 ************************************ 00:05:46.381 12:05:33 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:46.641 [2024-11-02 12:05:33.362532] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:05:46.641 [2024-11-02 12:05:33.362628] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1123711 ] 00:05:46.641 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.641 [2024-11-02 12:05:33.430360] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.641 [2024-11-02 12:05:33.465950] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.576 test_start 00:05:47.576 test_end 00:05:47.576 Performance: 972544 events per second 00:05:47.576 00:05:47.576 real 0m1.173s 00:05:47.576 user 0m1.088s 00:05:47.576 sys 0m0.081s 00:05:47.576 12:05:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:47.576 12:05:34 -- common/autotest_common.sh@10 -- # set +x 00:05:47.576 ************************************ 00:05:47.576 END TEST event_reactor_perf 00:05:47.576 ************************************ 00:05:47.835 12:05:34 -- event/event.sh@49 -- # uname -s 00:05:47.835 12:05:34 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:47.835 12:05:34 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:47.835 12:05:34 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:47.835 12:05:34 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:47.835 12:05:34 -- common/autotest_common.sh@10 -- # set +x 00:05:47.835 ************************************ 00:05:47.835 START TEST event_scheduler 00:05:47.835 ************************************ 00:05:47.835 12:05:34 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:47.835 * Looking for test storage... 00:05:47.835 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:47.835 12:05:34 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:47.835 12:05:34 -- scheduler/scheduler.sh@35 -- # scheduler_pid=1124022 00:05:47.835 12:05:34 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:47.835 12:05:34 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:47.835 12:05:34 -- scheduler/scheduler.sh@37 -- # waitforlisten 1124022 00:05:47.835 12:05:34 -- common/autotest_common.sh@819 -- # '[' -z 1124022 ']' 00:05:47.835 12:05:34 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:47.835 12:05:34 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:47.835 12:05:34 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:47.835 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:47.835 12:05:34 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:47.835 12:05:34 -- common/autotest_common.sh@10 -- # set +x 00:05:47.835 [2024-11-02 12:05:34.699566] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
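The scheduler app here is launched with --wait-for-rpc, so framework initialization pauses until told to proceed, and waitforlisten parks the script until the app's RPC socket answers. A condensed sketch of that readiness poll (assumption: simplified from the real waitforlisten; the function name is illustrative):

wait_for_rpc_socket() {
  local pid=$1 sock=${2:-/var/tmp/spdk.sock}
  local i
  for i in $(seq 1 100); do
    kill -0 "$pid" 2>/dev/null || return 1   # target died before listening
    [ -S "$sock" ] && return 0               # RPC socket is up
    sleep 0.1
  done
  return 1                                   # timed out
}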
00:05:47.835 [2024-11-02 12:05:34.699657] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1124022 ] 00:05:47.835 EAL: No free 2048 kB hugepages reported on node 1 00:05:47.835 [2024-11-02 12:05:34.764756] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:47.835 [2024-11-02 12:05:34.804575] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.835 [2024-11-02 12:05:34.804661] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:47.835 [2024-11-02 12:05:34.804750] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:47.835 [2024-11-02 12:05:34.804752] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:48.094 12:05:34 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:48.094 12:05:34 -- common/autotest_common.sh@852 -- # return 0 00:05:48.094 12:05:34 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:48.094 12:05:34 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.094 12:05:34 -- common/autotest_common.sh@10 -- # set +x 00:05:48.094 POWER: Env isn't set yet! 00:05:48.094 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:48.094 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:48.094 POWER: Cannot set governor of lcore 0 to userspace 00:05:48.094 POWER: Attempting to initialise PSTAT power management... 00:05:48.094 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:48.094 POWER: Initialized successfully for lcore 0 power management 00:05:48.094 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:48.094 POWER: Initialized successfully for lcore 1 power management 00:05:48.094 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:48.094 POWER: Initialized successfully for lcore 2 power management 00:05:48.094 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:48.094 POWER: Initialized successfully for lcore 3 power management 00:05:48.094 [2024-11-02 12:05:34.907174] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:48.094 [2024-11-02 12:05:34.907189] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:48.094 [2024-11-02 12:05:34.907198] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:48.094 12:05:34 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.094 12:05:34 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:48.094 12:05:34 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.094 12:05:34 -- common/autotest_common.sh@10 -- # set +x 00:05:48.094 [2024-11-02 12:05:34.969160] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
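With initialization parked, the test switches the framework to the dynamic scheduler and then releases it; the NOTICE lines above (load limit 20, core limit 80, core busy 95) are the dynamic scheduler announcing its thresholds, and the POWER lines show DPDK moving each lcore's cpufreq governor to performance. The same two RPCs, replayed by hand from the repo root (sketch; assumes the default /var/tmp/spdk.sock socket):

sudo ./scripts/rpc.py framework_set_scheduler dynamic
sudo ./scripts/rpc.py framework_start_init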
00:05:48.094 12:05:34 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.094 12:05:34 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:48.094 12:05:34 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.094 12:05:34 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.094 12:05:34 -- common/autotest_common.sh@10 -- # set +x 00:05:48.094 ************************************ 00:05:48.094 START TEST scheduler_create_thread 00:05:48.094 ************************************ 00:05:48.094 12:05:34 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:05:48.094 12:05:34 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:48.094 12:05:34 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.094 12:05:34 -- common/autotest_common.sh@10 -- # set +x 00:05:48.094 2 00:05:48.094 12:05:34 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.094 12:05:34 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:48.094 12:05:34 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.094 12:05:34 -- common/autotest_common.sh@10 -- # set +x 00:05:48.094 3 00:05:48.094 12:05:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.094 12:05:35 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:48.095 12:05:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.095 12:05:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.095 4 00:05:48.095 12:05:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.095 12:05:35 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:48.095 12:05:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.095 12:05:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.095 5 00:05:48.095 12:05:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.095 12:05:35 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:48.095 12:05:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.095 12:05:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.095 6 00:05:48.095 12:05:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.095 12:05:35 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:48.095 12:05:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.095 12:05:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.095 7 00:05:48.095 12:05:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.095 12:05:35 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:48.095 12:05:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.095 12:05:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.095 8 00:05:48.095 12:05:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.095 12:05:35 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:48.095 12:05:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.095 12:05:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.095 9 00:05:48.095 
12:05:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.095 12:05:35 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:48.095 12:05:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.095 12:05:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.353 10 00:05:48.353 12:05:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.353 12:05:35 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:48.353 12:05:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.353 12:05:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.353 12:05:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.353 12:05:35 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:48.353 12:05:35 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:48.353 12:05:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.353 12:05:35 -- common/autotest_common.sh@10 -- # set +x 00:05:49.289 12:05:35 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:49.289 12:05:35 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:49.289 12:05:35 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:49.289 12:05:35 -- common/autotest_common.sh@10 -- # set +x 00:05:50.666 12:05:37 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:50.666 12:05:37 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:50.666 12:05:37 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:50.666 12:05:37 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:50.666 12:05:37 -- common/autotest_common.sh@10 -- # set +x 00:05:51.602 12:05:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:51.602 00:05:51.602 real 0m3.382s 00:05:51.602 user 0m0.028s 00:05:51.602 sys 0m0.003s 00:05:51.602 12:05:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:51.602 12:05:38 -- common/autotest_common.sh@10 -- # set +x 00:05:51.602 ************************************ 00:05:51.602 END TEST scheduler_create_thread 00:05:51.602 ************************************ 00:05:51.602 12:05:38 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:51.602 12:05:38 -- scheduler/scheduler.sh@46 -- # killprocess 1124022 00:05:51.602 12:05:38 -- common/autotest_common.sh@926 -- # '[' -z 1124022 ']' 00:05:51.602 12:05:38 -- common/autotest_common.sh@930 -- # kill -0 1124022 00:05:51.602 12:05:38 -- common/autotest_common.sh@931 -- # uname 00:05:51.602 12:05:38 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:51.602 12:05:38 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1124022 00:05:51.602 12:05:38 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:05:51.602 12:05:38 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:05:51.602 12:05:38 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1124022' 00:05:51.602 killing process with pid 1124022 00:05:51.602 12:05:38 -- common/autotest_common.sh@945 -- # kill 1124022 00:05:51.602 12:05:38 -- common/autotest_common.sh@950 -- # wait 1124022 00:05:51.861 [2024-11-02 12:05:38.740900] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
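The scheduler_create_thread test above builds its thread mix through test-only RPCs from scheduler_plugin: four pinned active threads (-a 100) and four pinned idle ones (-a 0), an unpinned one_third_active thread (-a 30), a half_active thread whose activity is then raised with scheduler_thread_set_active, and a thread named deleted that is created and removed again. Replayed by hand (sketch; thread ids such as 11 and 12 are values returned by the create calls, and pointing PYTHONPATH at the scheduler test directory so rpc.py can find the plugin is an assumption here):

sudo ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
sudo ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_set_active 11 50   # id from the create
sudo ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_delete 12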
00:05:51.861 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:51.861 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:51.861 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:51.861 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:51.861 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:51.861 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:51.861 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:51.861 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:52.120 00:05:52.120 real 0m4.378s 00:05:52.120 user 0m7.796s 00:05:52.120 sys 0m0.374s 00:05:52.120 12:05:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:52.120 12:05:38 -- common/autotest_common.sh@10 -- # set +x 00:05:52.120 ************************************ 00:05:52.120 END TEST event_scheduler 00:05:52.120 ************************************ 00:05:52.120 12:05:38 -- event/event.sh@51 -- # modprobe -n nbd 00:05:52.120 12:05:39 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:52.120 12:05:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:52.120 12:05:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:52.120 12:05:39 -- common/autotest_common.sh@10 -- # set +x 00:05:52.120 ************************************ 00:05:52.120 START TEST app_repeat 00:05:52.120 ************************************ 00:05:52.120 12:05:39 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:05:52.120 12:05:39 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:52.120 12:05:39 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:52.120 12:05:39 -- event/event.sh@13 -- # local nbd_list 00:05:52.120 12:05:39 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:52.120 12:05:39 -- event/event.sh@14 -- # local bdev_list 00:05:52.120 12:05:39 -- event/event.sh@15 -- # local repeat_times=4 00:05:52.120 12:05:39 -- event/event.sh@17 -- # modprobe nbd 00:05:52.120 12:05:39 -- event/event.sh@19 -- # repeat_pid=1124880 00:05:52.120 12:05:39 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:52.120 12:05:39 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:52.120 12:05:39 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1124880' 00:05:52.120 Process app_repeat pid: 1124880 00:05:52.120 12:05:39 -- event/event.sh@23 -- # for i in {0..2} 00:05:52.120 12:05:39 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:52.120 spdk_app_start Round 0 00:05:52.120 12:05:39 -- event/event.sh@25 -- # waitforlisten 1124880 /var/tmp/spdk-nbd.sock 00:05:52.120 12:05:39 -- common/autotest_common.sh@819 -- # '[' -z 1124880 ']' 00:05:52.120 12:05:39 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:52.120 12:05:39 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:52.120 12:05:39 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:05:52.120 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:52.120 12:05:39 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:52.120 12:05:39 -- common/autotest_common.sh@10 -- # set +x 00:05:52.120 [2024-11-02 12:05:39.039246] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:05:52.120 [2024-11-02 12:05:39.039332] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1124880 ] 00:05:52.120 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.445 [2024-11-02 12:05:39.110068] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:52.445 [2024-11-02 12:05:39.145380] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:52.445 [2024-11-02 12:05:39.145382] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.083 12:05:39 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:53.083 12:05:39 -- common/autotest_common.sh@852 -- # return 0 00:05:53.083 12:05:39 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:53.083 Malloc0 00:05:53.083 12:05:40 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:53.341 Malloc1 00:05:53.341 12:05:40 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@12 -- # local i 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:53.341 12:05:40 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:53.600 /dev/nbd0 00:05:53.600 12:05:40 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:53.600 12:05:40 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:53.600 12:05:40 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:53.600 12:05:40 -- common/autotest_common.sh@857 -- # local i 00:05:53.600 12:05:40 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:53.600 12:05:40 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:53.600 12:05:40 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:53.600 12:05:40 -- 
common/autotest_common.sh@861 -- # break 00:05:53.600 12:05:40 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:53.600 12:05:40 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:53.600 12:05:40 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:53.600 1+0 records in 00:05:53.600 1+0 records out 00:05:53.600 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256567 s, 16.0 MB/s 00:05:53.600 12:05:40 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:53.600 12:05:40 -- common/autotest_common.sh@874 -- # size=4096 00:05:53.600 12:05:40 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:53.600 12:05:40 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:53.600 12:05:40 -- common/autotest_common.sh@877 -- # return 0 00:05:53.600 12:05:40 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:53.600 12:05:40 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:53.600 12:05:40 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:53.859 /dev/nbd1 00:05:53.859 12:05:40 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:53.859 12:05:40 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:53.859 12:05:40 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:53.859 12:05:40 -- common/autotest_common.sh@857 -- # local i 00:05:53.859 12:05:40 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:53.859 12:05:40 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:53.859 12:05:40 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:53.859 12:05:40 -- common/autotest_common.sh@861 -- # break 00:05:53.859 12:05:40 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:53.859 12:05:40 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:53.859 12:05:40 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:53.859 1+0 records in 00:05:53.859 1+0 records out 00:05:53.859 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236803 s, 17.3 MB/s 00:05:53.859 12:05:40 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:53.859 12:05:40 -- common/autotest_common.sh@874 -- # size=4096 00:05:53.859 12:05:40 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:53.859 12:05:40 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:53.859 12:05:40 -- common/autotest_common.sh@877 -- # return 0 00:05:53.859 12:05:40 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:53.859 12:05:40 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:53.859 12:05:40 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:53.859 12:05:40 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:53.859 12:05:40 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:54.118 { 00:05:54.118 "nbd_device": "/dev/nbd0", 00:05:54.118 "bdev_name": "Malloc0" 00:05:54.118 }, 00:05:54.118 { 00:05:54.118 "nbd_device": 
"/dev/nbd1", 00:05:54.118 "bdev_name": "Malloc1" 00:05:54.118 } 00:05:54.118 ]' 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:54.118 { 00:05:54.118 "nbd_device": "/dev/nbd0", 00:05:54.118 "bdev_name": "Malloc0" 00:05:54.118 }, 00:05:54.118 { 00:05:54.118 "nbd_device": "/dev/nbd1", 00:05:54.118 "bdev_name": "Malloc1" 00:05:54.118 } 00:05:54.118 ]' 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:54.118 /dev/nbd1' 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:54.118 /dev/nbd1' 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@65 -- # count=2 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@95 -- # count=2 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:54.118 256+0 records in 00:05:54.118 256+0 records out 00:05:54.118 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0116417 s, 90.1 MB/s 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:54.118 256+0 records in 00:05:54.118 256+0 records out 00:05:54.118 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0196376 s, 53.4 MB/s 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:54.118 256+0 records in 00:05:54.118 256+0 records out 00:05:54.118 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212989 s, 49.2 MB/s 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:54.118 12:05:40 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:54.118 12:05:40 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:54.118 12:05:41 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:54.118 12:05:41 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:54.118 12:05:41 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:54.118 12:05:41 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:54.118 12:05:41 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:54.118 12:05:41 -- bdev/nbd_common.sh@51 -- # local i 00:05:54.118 12:05:41 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:54.118 12:05:41 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:54.377 12:05:41 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:54.377 12:05:41 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:54.377 12:05:41 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:54.377 12:05:41 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:54.377 12:05:41 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:54.377 12:05:41 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:54.377 12:05:41 -- bdev/nbd_common.sh@41 -- # break 00:05:54.377 12:05:41 -- bdev/nbd_common.sh@45 -- # return 0 00:05:54.377 12:05:41 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:54.377 12:05:41 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@41 -- # break 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@45 -- # return 0 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:54.636 12:05:41 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:54.895 12:05:41 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:54.895 12:05:41 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:54.895 12:05:41 -- bdev/nbd_common.sh@65 -- # true 00:05:54.895 12:05:41 -- bdev/nbd_common.sh@65 -- # count=0 00:05:54.895 12:05:41 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:54.895 12:05:41 -- bdev/nbd_common.sh@104 -- # count=0 00:05:54.895 12:05:41 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:54.895 12:05:41 -- bdev/nbd_common.sh@109 -- # return 0 00:05:54.895 12:05:41 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
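That closes the body of app_repeat Round 0: two 64 MB Malloc bdevs are exported as /dev/nbd0 and /dev/nbd1, 1 MiB of random data is pushed through each with dd and read back with cmp, and the NBD devices are stopped again (nbd_get_disks returning [] confirms it) before the instance is signalled. The data-verify core of a round, condensed (sketch; the scratch-file path is an assumption):

tmp=/tmp/nbdrandtest                              # hypothetical scratch file
dd if=/dev/urandom of="$tmp" bs=4096 count=256    # 1 MiB of random data
dd if="$tmp" of=/dev/nbd0 bs=4096 count=256 oflag=direct
cmp -b -n 1M "$tmp" /dev/nbd0 && echo 'nbd0 verified'
rm "$tmp"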
00:05:54.895 12:05:41 -- event/event.sh@35 -- # sleep 3 00:05:55.154 [2024-11-02 12:05:41.987822] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:55.154 [2024-11-02 12:05:42.020788] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:55.154 [2024-11-02 12:05:42.020790] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.154 [2024-11-02 12:05:42.059921] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:55.154 [2024-11-02 12:05:42.059966] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:58.441 12:05:44 -- event/event.sh@23 -- # for i in {0..2} 00:05:58.441 12:05:44 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:58.441 spdk_app_start Round 1 00:05:58.441 12:05:44 -- event/event.sh@25 -- # waitforlisten 1124880 /var/tmp/spdk-nbd.sock 00:05:58.441 12:05:44 -- common/autotest_common.sh@819 -- # '[' -z 1124880 ']' 00:05:58.441 12:05:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:58.441 12:05:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:58.441 12:05:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:58.441 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:58.441 12:05:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:58.441 12:05:44 -- common/autotest_common.sh@10 -- # set +x 00:05:58.441 12:05:44 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:58.441 12:05:44 -- common/autotest_common.sh@852 -- # return 0 00:05:58.441 12:05:44 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:58.441 Malloc0 00:05:58.441 12:05:45 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:58.441 Malloc1 00:05:58.441 12:05:45 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@12 -- # local i 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:58.441 12:05:45 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:58.700 
/dev/nbd0 00:05:58.700 12:05:45 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:58.700 12:05:45 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:58.700 12:05:45 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:05:58.700 12:05:45 -- common/autotest_common.sh@857 -- # local i 00:05:58.700 12:05:45 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:58.700 12:05:45 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:58.700 12:05:45 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:05:58.700 12:05:45 -- common/autotest_common.sh@861 -- # break 00:05:58.700 12:05:45 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:58.700 12:05:45 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:58.700 12:05:45 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:58.700 1+0 records in 00:05:58.700 1+0 records out 00:05:58.700 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242409 s, 16.9 MB/s 00:05:58.700 12:05:45 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:58.700 12:05:45 -- common/autotest_common.sh@874 -- # size=4096 00:05:58.700 12:05:45 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:58.700 12:05:45 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:58.700 12:05:45 -- common/autotest_common.sh@877 -- # return 0 00:05:58.700 12:05:45 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:58.700 12:05:45 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:58.700 12:05:45 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:58.958 /dev/nbd1 00:05:58.958 12:05:45 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:58.958 12:05:45 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:58.958 12:05:45 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:05:58.958 12:05:45 -- common/autotest_common.sh@857 -- # local i 00:05:58.958 12:05:45 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:05:58.958 12:05:45 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:05:58.959 12:05:45 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:05:58.959 12:05:45 -- common/autotest_common.sh@861 -- # break 00:05:58.959 12:05:45 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:05:58.959 12:05:45 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:05:58.959 12:05:45 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:58.959 1+0 records in 00:05:58.959 1+0 records out 00:05:58.959 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292811 s, 14.0 MB/s 00:05:58.959 12:05:45 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:58.959 12:05:45 -- common/autotest_common.sh@874 -- # size=4096 00:05:58.959 12:05:45 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:58.959 12:05:45 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:05:58.959 12:05:45 -- common/autotest_common.sh@877 -- # return 0 00:05:58.959 12:05:45 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:58.959 12:05:45 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:05:58.959 12:05:45 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:58.959 12:05:45 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:58.959 12:05:45 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:59.217 12:05:45 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:59.217 { 00:05:59.217 "nbd_device": "/dev/nbd0", 00:05:59.217 "bdev_name": "Malloc0" 00:05:59.217 }, 00:05:59.217 { 00:05:59.217 "nbd_device": "/dev/nbd1", 00:05:59.217 "bdev_name": "Malloc1" 00:05:59.217 } 00:05:59.217 ]' 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:59.217 { 00:05:59.217 "nbd_device": "/dev/nbd0", 00:05:59.217 "bdev_name": "Malloc0" 00:05:59.217 }, 00:05:59.217 { 00:05:59.217 "nbd_device": "/dev/nbd1", 00:05:59.217 "bdev_name": "Malloc1" 00:05:59.217 } 00:05:59.217 ]' 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:59.217 /dev/nbd1' 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:59.217 /dev/nbd1' 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@65 -- # count=2 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@95 -- # count=2 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:59.217 256+0 records in 00:05:59.217 256+0 records out 00:05:59.217 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104422 s, 100 MB/s 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:59.217 256+0 records in 00:05:59.217 256+0 records out 00:05:59.217 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0192999 s, 54.3 MB/s 00:05:59.217 12:05:46 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:59.218 256+0 records in 00:05:59.218 256+0 records out 00:05:59.218 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0214064 s, 49.0 MB/s 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@51 -- # local i 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:59.218 12:05:46 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:59.476 12:05:46 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:59.476 12:05:46 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:59.476 12:05:46 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:59.476 12:05:46 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:59.476 12:05:46 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:59.476 12:05:46 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:59.476 12:05:46 -- bdev/nbd_common.sh@41 -- # break 00:05:59.476 12:05:46 -- bdev/nbd_common.sh@45 -- # return 0 00:05:59.476 12:05:46 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:59.476 12:05:46 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:59.735 12:05:46 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:59.735 12:05:46 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:59.735 12:05:46 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:59.735 12:05:46 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:59.735 12:05:46 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:59.735 12:05:46 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:59.735 12:05:46 -- bdev/nbd_common.sh@41 -- # break 00:05:59.735 12:05:46 -- bdev/nbd_common.sh@45 -- # return 0 00:05:59.735 12:05:46 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:59.735 12:05:46 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:59.735 12:05:46 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:59.993 12:05:46 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:59.993 12:05:46 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:59.993 12:05:46 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:59.993 12:05:46 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:59.993 12:05:46 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:59.993 12:05:46 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:59.993 12:05:46 -- bdev/nbd_common.sh@65 -- # true 00:05:59.994 12:05:46 -- bdev/nbd_common.sh@65 -- # count=0 00:05:59.994 12:05:46 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:59.994 12:05:46 -- bdev/nbd_common.sh@104 -- # count=0 00:05:59.994 12:05:46 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:59.994 12:05:46 -- bdev/nbd_common.sh@109 -- # return 0 00:05:59.994 12:05:46 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:59.994 12:05:46 -- event/event.sh@35 -- # sleep 3 00:06:00.252 [2024-11-02 12:05:47.129711] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:00.252 [2024-11-02 12:05:47.163268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:00.252 [2024-11-02 12:05:47.163270] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:00.252 [2024-11-02 12:05:47.202953] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:00.252 [2024-11-02 12:05:47.203000] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:03.537 12:05:49 -- event/event.sh@23 -- # for i in {0..2} 00:06:03.537 12:05:49 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:03.537 spdk_app_start Round 2 00:06:03.537 12:05:49 -- event/event.sh@25 -- # waitforlisten 1124880 /var/tmp/spdk-nbd.sock 00:06:03.537 12:05:49 -- common/autotest_common.sh@819 -- # '[' -z 1124880 ']' 00:06:03.537 12:05:49 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:03.537 12:05:49 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:03.537 12:05:49 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:03.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
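Round 2 now repeats the identical cycle against a freshly started instance. The per-round RPC sequence, replayed by hand (sketch; assumes the app_repeat app is listening on /var/tmp/spdk-nbd.sock as above):

sudo ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096        # returns Malloc0
sudo ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
# ... write and verify against /dev/nbd0 ...
sudo ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0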
00:06:03.537 12:05:49 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:03.537 12:05:49 -- common/autotest_common.sh@10 -- # set +x 00:06:03.537 12:05:50 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:03.537 12:05:50 -- common/autotest_common.sh@852 -- # return 0 00:06:03.537 12:05:50 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:03.537 Malloc0 00:06:03.537 12:05:50 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:03.537 Malloc1 00:06:03.797 12:05:50 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@12 -- # local i 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:03.797 /dev/nbd0 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:03.797 12:05:50 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:06:03.797 12:05:50 -- common/autotest_common.sh@857 -- # local i 00:06:03.797 12:05:50 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:03.797 12:05:50 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:03.797 12:05:50 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:06:03.797 12:05:50 -- common/autotest_common.sh@861 -- # break 00:06:03.797 12:05:50 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:03.797 12:05:50 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:03.797 12:05:50 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:03.797 1+0 records in 00:06:03.797 1+0 records out 00:06:03.797 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260425 s, 15.7 MB/s 00:06:03.797 12:05:50 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.797 12:05:50 -- common/autotest_common.sh@874 -- # size=4096 00:06:03.797 12:05:50 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.797 12:05:50 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:03.797 12:05:50 -- common/autotest_common.sh@877 -- # return 0 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:03.797 12:05:50 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:04.057 /dev/nbd1 00:06:04.057 12:05:50 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:04.057 12:05:50 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:04.057 12:05:50 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:06:04.057 12:05:50 -- common/autotest_common.sh@857 -- # local i 00:06:04.057 12:05:50 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:04.057 12:05:50 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:04.057 12:05:50 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:06:04.057 12:05:50 -- common/autotest_common.sh@861 -- # break 00:06:04.057 12:05:50 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:04.057 12:05:50 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:04.057 12:05:50 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:04.057 1+0 records in 00:06:04.057 1+0 records out 00:06:04.057 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240817 s, 17.0 MB/s 00:06:04.057 12:05:50 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.057 12:05:50 -- common/autotest_common.sh@874 -- # size=4096 00:06:04.057 12:05:50 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.057 12:05:50 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:04.057 12:05:50 -- common/autotest_common.sh@877 -- # return 0 00:06:04.057 12:05:50 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:04.057 12:05:50 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:04.057 12:05:50 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:04.057 12:05:50 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.057 12:05:50 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:04.316 { 00:06:04.316 "nbd_device": "/dev/nbd0", 00:06:04.316 "bdev_name": "Malloc0" 00:06:04.316 }, 00:06:04.316 { 00:06:04.316 "nbd_device": "/dev/nbd1", 00:06:04.316 "bdev_name": "Malloc1" 00:06:04.316 } 00:06:04.316 ]' 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:04.316 { 00:06:04.316 "nbd_device": "/dev/nbd0", 00:06:04.316 "bdev_name": "Malloc0" 00:06:04.316 }, 00:06:04.316 { 00:06:04.316 "nbd_device": "/dev/nbd1", 00:06:04.316 "bdev_name": "Malloc1" 00:06:04.316 } 00:06:04.316 ]' 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:04.316 /dev/nbd1' 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:04.316 /dev/nbd1' 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@65 -- # count=2 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:04.316 12:05:51 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:04.316 256+0 records in 00:06:04.316 256+0 records out 00:06:04.316 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106136 s, 98.8 MB/s 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:04.316 256+0 records in 00:06:04.316 256+0 records out 00:06:04.316 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202944 s, 51.7 MB/s 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:04.316 256+0 records in 00:06:04.316 256+0 records out 00:06:04.316 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210726 s, 49.8 MB/s 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@51 -- # local i 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:04.316 12:05:51 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:04.575 12:05:51 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:04.575 12:05:51 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:04.575 12:05:51 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:04.575 12:05:51 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:04.575 12:05:51 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:04.576 12:05:51 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:04.576 12:05:51 -- bdev/nbd_common.sh@41 -- # break 00:06:04.576 12:05:51 -- bdev/nbd_common.sh@45 -- # return 0 00:06:04.576 12:05:51 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:04.576 12:05:51 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:04.835 12:05:51 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:04.835 12:05:51 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:04.835 12:05:51 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:04.835 12:05:51 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:04.835 12:05:51 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:04.835 12:05:51 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:04.835 12:05:51 -- bdev/nbd_common.sh@41 -- # break 00:06:04.835 12:05:51 -- bdev/nbd_common.sh@45 -- # return 0 00:06:04.835 12:05:51 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:04.835 12:05:51 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.835 12:05:51 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@65 -- # true 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@65 -- # count=0 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@104 -- # count=0 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:05.094 12:05:51 -- bdev/nbd_common.sh@109 -- # return 0 00:06:05.094 12:05:51 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:05.353 12:05:52 -- event/event.sh@35 -- # sleep 3 00:06:05.353 [2024-11-02 12:05:52.263484] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:05.353 [2024-11-02 12:05:52.296775] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:05.353 [2024-11-02 12:05:52.296777] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.613 [2024-11-02 12:05:52.336488] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:05.613 [2024-11-02 12:05:52.336529] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
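The nbd section above follows a fixed write/verify cycle: seed a 1 MiB reference file from /dev/urandom, dd it onto every exported /dev/nbdX with O_DIRECT, compare each device back against the file with cmp, then stop the disks over the RPC socket. A minimal standalone sketch of that cycle, assuming the devices were already exported with nbd_start_disk (the tmp path here is illustrative):

    tmp_file=/tmp/nbdrandtest
    nbd_list=(/dev/nbd0 /dev/nbd1)
    # seed 1 MiB (256 x 4 KiB) of reference data
    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
    for dev in "${nbd_list[@]}"; do
        # write the reference data, bypassing the page cache so the
        # nbd backend really sees the I/O
        dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
    done
    for dev in "${nbd_list[@]}"; do
        # byte-for-byte comparison of the first 1 MiB
        cmp -b -n 1M "$tmp_file" "$dev"
    done
    rm "$tmp_file"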
00:06:08.149 12:05:55 -- event/event.sh@38 -- # waitforlisten 1124880 /var/tmp/spdk-nbd.sock 00:06:08.149 12:05:55 -- common/autotest_common.sh@819 -- # '[' -z 1124880 ']' 00:06:08.149 12:05:55 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:08.149 12:05:55 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:08.149 12:05:55 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:08.149 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:08.149 12:05:55 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:08.149 12:05:55 -- common/autotest_common.sh@10 -- # set +x 00:06:08.408 12:05:55 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:08.408 12:05:55 -- common/autotest_common.sh@852 -- # return 0 00:06:08.408 12:05:55 -- event/event.sh@39 -- # killprocess 1124880 00:06:08.408 12:05:55 -- common/autotest_common.sh@926 -- # '[' -z 1124880 ']' 00:06:08.408 12:05:55 -- common/autotest_common.sh@930 -- # kill -0 1124880 00:06:08.408 12:05:55 -- common/autotest_common.sh@931 -- # uname 00:06:08.408 12:05:55 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:08.408 12:05:55 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1124880 00:06:08.408 12:05:55 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:08.408 12:05:55 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:08.408 12:05:55 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1124880' 00:06:08.408 killing process with pid 1124880 00:06:08.408 12:05:55 -- common/autotest_common.sh@945 -- # kill 1124880 00:06:08.408 12:05:55 -- common/autotest_common.sh@950 -- # wait 1124880 00:06:08.667 spdk_app_start is called in Round 0. 00:06:08.667 Shutdown signal received, stop current app iteration 00:06:08.667 Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 reinitialization... 00:06:08.667 spdk_app_start is called in Round 1. 00:06:08.667 Shutdown signal received, stop current app iteration 00:06:08.667 Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 reinitialization... 00:06:08.667 spdk_app_start is called in Round 2. 00:06:08.667 Shutdown signal received, stop current app iteration 00:06:08.667 Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 reinitialization... 00:06:08.667 spdk_app_start is called in Round 3. 
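killprocess, which appears throughout the remaining traces, wraps plain kill with two safety checks: the pid must still be alive (kill -0), and on Linux its command name from ps must not resolve to sudo. A simplified sketch reconstructed from the trace; the real helper in autotest_common.sh carries extra handling for sudo-wrapped targets:

    killprocess() {
        local pid=$1
        kill -0 "$pid" || return 1                  # process must still exist
        if [ "$(uname)" = Linux ]; then
            local process_name
            process_name=$(ps --no-headers -o comm= "$pid")
            # the real helper special-cases sudo-wrapped targets here
            [ "$process_name" = sudo ] && return 1
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"                                 # reap it so the pid is really gone
    }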
00:06:08.667 Shutdown signal received, stop current app iteration 00:06:08.667 12:05:55 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:08.667 12:05:55 -- event/event.sh@42 -- # return 0 00:06:08.667 00:06:08.667 real 0m16.458s 00:06:08.667 user 0m35.258s 00:06:08.667 sys 0m3.097s 00:06:08.667 12:05:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:08.667 12:05:55 -- common/autotest_common.sh@10 -- # set +x 00:06:08.667 ************************************ 00:06:08.667 END TEST app_repeat 00:06:08.667 ************************************ 00:06:08.667 12:05:55 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:08.667 12:05:55 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:08.667 12:05:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:08.667 12:05:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:08.667 12:05:55 -- common/autotest_common.sh@10 -- # set +x 00:06:08.667 ************************************ 00:06:08.667 START TEST cpu_locks 00:06:08.667 ************************************ 00:06:08.667 12:05:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:08.667 * Looking for test storage... 00:06:08.667 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:08.667 12:05:55 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:08.667 12:05:55 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:08.667 12:05:55 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:08.667 12:05:55 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:08.667 12:05:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:08.667 12:05:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:08.667 12:05:55 -- common/autotest_common.sh@10 -- # set +x 00:06:08.667 ************************************ 00:06:08.667 START TEST default_locks 00:06:08.667 ************************************ 00:06:08.667 12:05:55 -- common/autotest_common.sh@1104 -- # default_locks 00:06:08.667 12:05:55 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1128015 00:06:08.667 12:05:55 -- event/cpu_locks.sh@47 -- # waitforlisten 1128015 00:06:08.668 12:05:55 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:08.668 12:05:55 -- common/autotest_common.sh@819 -- # '[' -z 1128015 ']' 00:06:08.668 12:05:55 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:08.668 12:05:55 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:08.668 12:05:55 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:08.668 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:08.668 12:05:55 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:08.668 12:05:55 -- common/autotest_common.sh@10 -- # set +x 00:06:08.926 [2024-11-02 12:05:55.655016] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
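default_locks starts a bare spdk_tgt pinned to core 0 (-m 0x1) and blocks in waitforlisten until the target's RPC socket is usable. A minimal sketch of that wait loop, treating socket readiness as a good-enough proxy; the real helper in autotest_common.sh also probes the RPC server across its max_retries=100 budget:

    waitforlisten() {
        local pid=$1
        local rpc_addr=${2:-/var/tmp/spdk.sock}
        local max_retries=100
        echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
        while (( max_retries-- > 0 )); do
            kill -0 "$pid" 2>/dev/null || return 1   # target died during startup
            [ -S "$rpc_addr" ] && return 0            # socket is up: target is listening
            sleep 0.1
        done
        return 1
    }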
00:06:08.926 [2024-11-02 12:05:55.655105] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1128015 ] 00:06:08.926 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.926 [2024-11-02 12:05:55.723797] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.926 [2024-11-02 12:05:55.761194] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:08.926 [2024-11-02 12:05:55.761309] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.862 12:05:56 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:09.862 12:05:56 -- common/autotest_common.sh@852 -- # return 0 00:06:09.862 12:05:56 -- event/cpu_locks.sh@49 -- # locks_exist 1128015 00:06:09.862 12:05:56 -- event/cpu_locks.sh@22 -- # lslocks -p 1128015 00:06:09.862 12:05:56 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:10.121 lslocks: write error 00:06:10.121 12:05:56 -- event/cpu_locks.sh@50 -- # killprocess 1128015 00:06:10.121 12:05:56 -- common/autotest_common.sh@926 -- # '[' -z 1128015 ']' 00:06:10.121 12:05:56 -- common/autotest_common.sh@930 -- # kill -0 1128015 00:06:10.121 12:05:56 -- common/autotest_common.sh@931 -- # uname 00:06:10.121 12:05:56 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:10.121 12:05:56 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1128015 00:06:10.121 12:05:56 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:10.121 12:05:56 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:10.121 12:05:56 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1128015' 00:06:10.121 killing process with pid 1128015 00:06:10.121 12:05:56 -- common/autotest_common.sh@945 -- # kill 1128015 00:06:10.121 12:05:56 -- common/autotest_common.sh@950 -- # wait 1128015 00:06:10.380 12:05:57 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1128015 00:06:10.380 12:05:57 -- common/autotest_common.sh@640 -- # local es=0 00:06:10.380 12:05:57 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 1128015 00:06:10.380 12:05:57 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:10.380 12:05:57 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:10.380 12:05:57 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:10.380 12:05:57 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:10.380 12:05:57 -- common/autotest_common.sh@643 -- # waitforlisten 1128015 00:06:10.380 12:05:57 -- common/autotest_common.sh@819 -- # '[' -z 1128015 ']' 00:06:10.380 12:05:57 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:10.380 12:05:57 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:10.380 12:05:57 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:10.380 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
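locks_exist, traced above, is how these tests observe the CPU-core locks from the outside: a running spdk_tgt holds a file lock for every core in its mask, and lslocks -p <pid> lists them with the spdk_cpu_lock marker. The stray "lslocks: write error" lines are harmless: grep -q exits on the first match and closes the pipe, so lslocks hits EPIPE while still writing. A sketch of the check:

    locks_exist() {
        local pid=$1
        # every core in the target's mask shows up as an spdk_cpu_lock entry
        lslocks -p "$pid" | grep -q spdk_cpu_lock
    }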
00:06:10.380 12:05:57 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:10.380 12:05:57 -- common/autotest_common.sh@10 -- # set +x 00:06:10.380 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (1128015) - No such process 00:06:10.380 ERROR: process (pid: 1128015) is no longer running 00:06:10.380 12:05:57 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:10.380 12:05:57 -- common/autotest_common.sh@852 -- # return 1 00:06:10.380 12:05:57 -- common/autotest_common.sh@643 -- # es=1 00:06:10.380 12:05:57 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:10.380 12:05:57 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:10.380 12:05:57 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:10.380 12:05:57 -- event/cpu_locks.sh@54 -- # no_locks 00:06:10.380 12:05:57 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:10.380 12:05:57 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:10.380 12:05:57 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:10.380 00:06:10.380 real 0m1.593s 00:06:10.380 user 0m1.680s 00:06:10.380 sys 0m0.569s 00:06:10.380 12:05:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:10.380 12:05:57 -- common/autotest_common.sh@10 -- # set +x 00:06:10.380 ************************************ 00:06:10.380 END TEST default_locks 00:06:10.380 ************************************ 00:06:10.380 12:05:57 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:10.380 12:05:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:10.380 12:05:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:10.380 12:05:57 -- common/autotest_common.sh@10 -- # set +x 00:06:10.380 ************************************ 00:06:10.380 START TEST default_locks_via_rpc 00:06:10.380 ************************************ 00:06:10.380 12:05:57 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:06:10.380 12:05:57 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1128370 00:06:10.380 12:05:57 -- event/cpu_locks.sh@63 -- # waitforlisten 1128370 00:06:10.380 12:05:57 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:10.380 12:05:57 -- common/autotest_common.sh@819 -- # '[' -z 1128370 ']' 00:06:10.380 12:05:57 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:10.380 12:05:57 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:10.380 12:05:57 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:10.380 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:10.380 12:05:57 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:10.380 12:05:57 -- common/autotest_common.sh@10 -- # set +x 00:06:10.380 [2024-11-02 12:05:57.297188] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
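The tail of default_locks above exercises the negative path: after the target is killed, waitforlisten against the dead pid must fail, and the NOT wrapper turns that expected failure into a test pass (the es=1 in the trace is the captured exit status). A sketch of the inversion, simplified from autotest_common.sh, which additionally special-cases es values above 128 (death by signal), visible in the trace as the '(( es > 128 ))' check:

    NOT() {
        local es=0
        "$@" || es=$?
        # the wrapped command was expected to fail; succeed only if it did
        (( es != 0 ))
    }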
00:06:10.380 [2024-11-02 12:05:57.297258] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1128370 ] 00:06:10.380 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.639 [2024-11-02 12:05:57.364081] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.639 [2024-11-02 12:05:57.398562] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:10.639 [2024-11-02 12:05:57.398694] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.205 12:05:58 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:11.205 12:05:58 -- common/autotest_common.sh@852 -- # return 0 00:06:11.205 12:05:58 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:11.205 12:05:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:11.205 12:05:58 -- common/autotest_common.sh@10 -- # set +x 00:06:11.205 12:05:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:11.205 12:05:58 -- event/cpu_locks.sh@67 -- # no_locks 00:06:11.205 12:05:58 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:11.205 12:05:58 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:11.205 12:05:58 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:11.205 12:05:58 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:11.205 12:05:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:11.205 12:05:58 -- common/autotest_common.sh@10 -- # set +x 00:06:11.205 12:05:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:11.205 12:05:58 -- event/cpu_locks.sh@71 -- # locks_exist 1128370 00:06:11.205 12:05:58 -- event/cpu_locks.sh@22 -- # lslocks -p 1128370 00:06:11.205 12:05:58 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:11.464 12:05:58 -- event/cpu_locks.sh@73 -- # killprocess 1128370 00:06:11.464 12:05:58 -- common/autotest_common.sh@926 -- # '[' -z 1128370 ']' 00:06:11.464 12:05:58 -- common/autotest_common.sh@930 -- # kill -0 1128370 00:06:11.464 12:05:58 -- common/autotest_common.sh@931 -- # uname 00:06:11.464 12:05:58 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:11.464 12:05:58 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1128370 00:06:11.723 12:05:58 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:11.723 12:05:58 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:11.723 12:05:58 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1128370' 00:06:11.723 killing process with pid 1128370 00:06:11.723 12:05:58 -- common/autotest_common.sh@945 -- # kill 1128370 00:06:11.723 12:05:58 -- common/autotest_common.sh@950 -- # wait 1128370 00:06:11.981 00:06:11.981 real 0m1.505s 00:06:11.982 user 0m1.596s 00:06:11.982 sys 0m0.497s 00:06:11.982 12:05:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:11.982 12:05:58 -- common/autotest_common.sh@10 -- # set +x 00:06:11.982 ************************************ 00:06:11.982 END TEST default_locks_via_rpc 00:06:11.982 ************************************ 00:06:11.982 12:05:58 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:11.982 12:05:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:11.982 12:05:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:11.982 12:05:58 -- 
common/autotest_common.sh@10 -- # set +x 00:06:11.982 ************************************ 00:06:11.982 START TEST non_locking_app_on_locked_coremask 00:06:11.982 ************************************ 00:06:11.982 12:05:58 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:06:11.982 12:05:58 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1128667 00:06:11.982 12:05:58 -- event/cpu_locks.sh@81 -- # waitforlisten 1128667 /var/tmp/spdk.sock 00:06:11.982 12:05:58 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:11.982 12:05:58 -- common/autotest_common.sh@819 -- # '[' -z 1128667 ']' 00:06:11.982 12:05:58 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:11.982 12:05:58 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:11.982 12:05:58 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:11.982 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:11.982 12:05:58 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:11.982 12:05:58 -- common/autotest_common.sh@10 -- # set +x 00:06:11.982 [2024-11-02 12:05:58.855266] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:11.982 [2024-11-02 12:05:58.855362] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1128667 ] 00:06:11.982 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.982 [2024-11-02 12:05:58.921273] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.240 [2024-11-02 12:05:58.959152] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:12.240 [2024-11-02 12:05:58.959259] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.811 12:05:59 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:12.811 12:05:59 -- common/autotest_common.sh@852 -- # return 0 00:06:12.811 12:05:59 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1128695 00:06:12.811 12:05:59 -- event/cpu_locks.sh@85 -- # waitforlisten 1128695 /var/tmp/spdk2.sock 00:06:12.811 12:05:59 -- common/autotest_common.sh@819 -- # '[' -z 1128695 ']' 00:06:12.811 12:05:59 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:12.811 12:05:59 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:12.811 12:05:59 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:12.811 12:05:59 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:12.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:12.811 12:05:59 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:12.811 12:05:59 -- common/autotest_common.sh@10 -- # set +x 00:06:12.811 [2024-11-02 12:05:59.705844] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
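The default_locks_via_rpc run that finished above toggled the same locks at runtime instead of at process exit: framework_disable_cpumask_locks releases the per-core lock files while the target keeps running, after which no_locks must see an empty /var/tmp/spdk_cpu_lock_* glob, and framework_enable_cpumask_locks re-claims them. A sketch of the toggle, with $rpc standing in for the scripts/rpc.py path used in the trace and $SPDK_DIR an assumed checkout location:

    rpc="$SPDK_DIR/scripts/rpc.py"            # assumed SPDK checkout location
    "$rpc" framework_disable_cpumask_locks    # drop the core locks at runtime
    ls /var/tmp/spdk_cpu_lock_* 2>/dev/null   # expected: no matches while disabled
    "$rpc" framework_enable_cpumask_locks     # re-claim; fails if another process took a core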
00:06:12.811 [2024-11-02 12:05:59.705895] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1128695 ] 00:06:12.811 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.071 [2024-11-02 12:05:59.787028] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:13.071 [2024-11-02 12:05:59.787052] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.071 [2024-11-02 12:05:59.858908] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:13.071 [2024-11-02 12:05:59.859041] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.640 12:06:00 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:13.640 12:06:00 -- common/autotest_common.sh@852 -- # return 0 00:06:13.640 12:06:00 -- event/cpu_locks.sh@87 -- # locks_exist 1128667 00:06:13.640 12:06:00 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:13.640 12:06:00 -- event/cpu_locks.sh@22 -- # lslocks -p 1128667 00:06:15.017 lslocks: write error 00:06:15.017 12:06:01 -- event/cpu_locks.sh@89 -- # killprocess 1128667 00:06:15.018 12:06:01 -- common/autotest_common.sh@926 -- # '[' -z 1128667 ']' 00:06:15.018 12:06:01 -- common/autotest_common.sh@930 -- # kill -0 1128667 00:06:15.018 12:06:01 -- common/autotest_common.sh@931 -- # uname 00:06:15.018 12:06:01 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:15.018 12:06:01 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1128667 00:06:15.018 12:06:01 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:15.018 12:06:01 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:15.018 12:06:01 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1128667' 00:06:15.018 killing process with pid 1128667 00:06:15.018 12:06:01 -- common/autotest_common.sh@945 -- # kill 1128667 00:06:15.018 12:06:01 -- common/autotest_common.sh@950 -- # wait 1128667 00:06:15.584 12:06:02 -- event/cpu_locks.sh@90 -- # killprocess 1128695 00:06:15.584 12:06:02 -- common/autotest_common.sh@926 -- # '[' -z 1128695 ']' 00:06:15.584 12:06:02 -- common/autotest_common.sh@930 -- # kill -0 1128695 00:06:15.584 12:06:02 -- common/autotest_common.sh@931 -- # uname 00:06:15.584 12:06:02 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:15.584 12:06:02 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1128695 00:06:15.585 12:06:02 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:15.585 12:06:02 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:15.585 12:06:02 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1128695' 00:06:15.585 killing process with pid 1128695 00:06:15.585 12:06:02 -- common/autotest_common.sh@945 -- # kill 1128695 00:06:15.585 12:06:02 -- common/autotest_common.sh@950 -- # wait 1128695 00:06:15.843 00:06:15.843 real 0m3.878s 00:06:15.843 user 0m4.160s 00:06:15.843 sys 0m1.254s 00:06:15.843 12:06:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:15.843 12:06:02 -- common/autotest_common.sh@10 -- # set +x 00:06:15.843 ************************************ 00:06:15.843 END TEST non_locking_app_on_locked_coremask 00:06:15.843 ************************************ 00:06:15.843 12:06:02 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask 
locking_app_on_unlocked_coremask 00:06:15.843 12:06:02 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:15.843 12:06:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:15.843 12:06:02 -- common/autotest_common.sh@10 -- # set +x 00:06:15.843 ************************************ 00:06:15.843 START TEST locking_app_on_unlocked_coremask 00:06:15.843 ************************************ 00:06:15.843 12:06:02 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:06:15.843 12:06:02 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1129397 00:06:15.843 12:06:02 -- event/cpu_locks.sh@99 -- # waitforlisten 1129397 /var/tmp/spdk.sock 00:06:15.843 12:06:02 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:15.843 12:06:02 -- common/autotest_common.sh@819 -- # '[' -z 1129397 ']' 00:06:15.843 12:06:02 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:15.843 12:06:02 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:15.843 12:06:02 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:15.843 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:15.843 12:06:02 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:15.843 12:06:02 -- common/autotest_common.sh@10 -- # set +x 00:06:15.843 [2024-11-02 12:06:02.776785] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:15.843 [2024-11-02 12:06:02.776881] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1129397 ] 00:06:15.843 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.102 [2024-11-02 12:06:02.843033] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:16.102 [2024-11-02 12:06:02.843059] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.102 [2024-11-02 12:06:02.880309] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.102 [2024-11-02 12:06:02.880435] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.669 12:06:03 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:16.669 12:06:03 -- common/autotest_common.sh@852 -- # return 0 00:06:16.669 12:06:03 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:16.669 12:06:03 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1129657 00:06:16.669 12:06:03 -- event/cpu_locks.sh@103 -- # waitforlisten 1129657 /var/tmp/spdk2.sock 00:06:16.669 12:06:03 -- common/autotest_common.sh@819 -- # '[' -z 1129657 ']' 00:06:16.669 12:06:03 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:16.669 12:06:03 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:16.669 12:06:03 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:16.669 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
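The non_locking run above, and the locking_app_on_unlocked run starting here, both pair two targets on one core; they only differ in which instance opts out of lock claiming. The opted-out instance passes --disable-cpumask-locks so it never tries to flock the core, and -r /var/tmp/spdk2.sock so the two RPC servers do not collide. A sketch of the launch pattern from the trace, with $SPDK_DIR an assumed checkout location:

    tgt="$SPDK_DIR/build/bin/spdk_tgt"        # assumed build location
    "$tgt" -m 0x1 &                           # first instance claims core 0
    pid1=$!
    waitforlisten "$pid1" /var/tmp/spdk.sock
    # second instance shares core 0: it must opt out of lock claiming and
    # listen on its own RPC socket
    "$tgt" -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &
    pid2=$!
    waitforlisten "$pid2" /var/tmp/spdk2.sock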
00:06:16.669 12:06:03 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:16.669 12:06:03 -- common/autotest_common.sh@10 -- # set +x 00:06:16.669 [2024-11-02 12:06:03.620284] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:16.669 [2024-11-02 12:06:03.620336] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1129657 ] 00:06:16.927 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.927 [2024-11-02 12:06:03.710026] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.927 [2024-11-02 12:06:03.783389] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.927 [2024-11-02 12:06:03.783500] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.863 12:06:04 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:17.863 12:06:04 -- common/autotest_common.sh@852 -- # return 0 00:06:17.863 12:06:04 -- event/cpu_locks.sh@105 -- # locks_exist 1129657 00:06:17.863 12:06:04 -- event/cpu_locks.sh@22 -- # lslocks -p 1129657 00:06:17.863 12:06:04 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:18.430 lslocks: write error 00:06:18.430 12:06:05 -- event/cpu_locks.sh@107 -- # killprocess 1129397 00:06:18.430 12:06:05 -- common/autotest_common.sh@926 -- # '[' -z 1129397 ']' 00:06:18.430 12:06:05 -- common/autotest_common.sh@930 -- # kill -0 1129397 00:06:18.430 12:06:05 -- common/autotest_common.sh@931 -- # uname 00:06:18.430 12:06:05 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:18.430 12:06:05 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1129397 00:06:18.430 12:06:05 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:18.689 12:06:05 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:18.689 12:06:05 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1129397' 00:06:18.689 killing process with pid 1129397 00:06:18.689 12:06:05 -- common/autotest_common.sh@945 -- # kill 1129397 00:06:18.689 12:06:05 -- common/autotest_common.sh@950 -- # wait 1129397 00:06:19.255 12:06:05 -- event/cpu_locks.sh@108 -- # killprocess 1129657 00:06:19.255 12:06:05 -- common/autotest_common.sh@926 -- # '[' -z 1129657 ']' 00:06:19.255 12:06:05 -- common/autotest_common.sh@930 -- # kill -0 1129657 00:06:19.255 12:06:05 -- common/autotest_common.sh@931 -- # uname 00:06:19.255 12:06:05 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:19.255 12:06:05 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1129657 00:06:19.255 12:06:06 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:19.255 12:06:06 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:19.255 12:06:06 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1129657' 00:06:19.255 killing process with pid 1129657 00:06:19.255 12:06:06 -- common/autotest_common.sh@945 -- # kill 1129657 00:06:19.255 12:06:06 -- common/autotest_common.sh@950 -- # wait 1129657 00:06:19.514 00:06:19.514 real 0m3.576s 00:06:19.514 user 0m3.852s 00:06:19.514 sys 0m1.106s 00:06:19.514 12:06:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.514 12:06:06 -- common/autotest_common.sh@10 -- # set +x 00:06:19.514 ************************************ 00:06:19.514 END TEST locking_app_on_unlocked_coremask 
00:06:19.514 ************************************ 00:06:19.514 12:06:06 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:19.514 12:06:06 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:19.514 12:06:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:19.514 12:06:06 -- common/autotest_common.sh@10 -- # set +x 00:06:19.514 ************************************ 00:06:19.514 START TEST locking_app_on_locked_coremask 00:06:19.514 ************************************ 00:06:19.514 12:06:06 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:06:19.514 12:06:06 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1130544 00:06:19.514 12:06:06 -- event/cpu_locks.sh@116 -- # waitforlisten 1130544 /var/tmp/spdk.sock 00:06:19.514 12:06:06 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:19.514 12:06:06 -- common/autotest_common.sh@819 -- # '[' -z 1130544 ']' 00:06:19.514 12:06:06 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:19.514 12:06:06 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:19.514 12:06:06 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:19.514 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:19.514 12:06:06 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:19.514 12:06:06 -- common/autotest_common.sh@10 -- # set +x 00:06:19.514 [2024-11-02 12:06:06.401249] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:19.514 [2024-11-02 12:06:06.401335] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1130544 ] 00:06:19.514 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.514 [2024-11-02 12:06:06.466705] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.773 [2024-11-02 12:06:06.504247] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:19.773 [2024-11-02 12:06:06.504355] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.340 12:06:07 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:20.340 12:06:07 -- common/autotest_common.sh@852 -- # return 0 00:06:20.340 12:06:07 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:20.340 12:06:07 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1130677 00:06:20.340 12:06:07 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1130677 /var/tmp/spdk2.sock 00:06:20.340 12:06:07 -- common/autotest_common.sh@640 -- # local es=0 00:06:20.340 12:06:07 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 1130677 /var/tmp/spdk2.sock 00:06:20.340 12:06:07 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:20.340 12:06:07 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:20.340 12:06:07 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:20.340 12:06:07 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:20.340 12:06:07 -- common/autotest_common.sh@643 -- # waitforlisten 1130677 /var/tmp/spdk2.sock 00:06:20.340 12:06:07 -- 
common/autotest_common.sh@819 -- # '[' -z 1130677 ']' 00:06:20.340 12:06:07 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:20.340 12:06:07 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:20.340 12:06:07 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:20.340 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:20.340 12:06:07 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:20.340 12:06:07 -- common/autotest_common.sh@10 -- # set +x 00:06:20.340 [2024-11-02 12:06:07.246650] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:20.340 [2024-11-02 12:06:07.246699] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1130677 ] 00:06:20.340 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.598 [2024-11-02 12:06:07.329793] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1130544 has claimed it. 00:06:20.598 [2024-11-02 12:06:07.329823] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:21.166 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (1130677) - No such process 00:06:21.166 ERROR: process (pid: 1130677) is no longer running 00:06:21.166 12:06:07 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:21.166 12:06:07 -- common/autotest_common.sh@852 -- # return 1 00:06:21.166 12:06:07 -- common/autotest_common.sh@643 -- # es=1 00:06:21.166 12:06:07 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:21.166 12:06:07 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:21.166 12:06:07 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:21.166 12:06:07 -- event/cpu_locks.sh@122 -- # locks_exist 1130544 00:06:21.166 12:06:07 -- event/cpu_locks.sh@22 -- # lslocks -p 1130544 00:06:21.166 12:06:07 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:21.425 lslocks: write error 00:06:21.425 12:06:08 -- event/cpu_locks.sh@124 -- # killprocess 1130544 00:06:21.425 12:06:08 -- common/autotest_common.sh@926 -- # '[' -z 1130544 ']' 00:06:21.425 12:06:08 -- common/autotest_common.sh@930 -- # kill -0 1130544 00:06:21.425 12:06:08 -- common/autotest_common.sh@931 -- # uname 00:06:21.425 12:06:08 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:21.425 12:06:08 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1130544 00:06:21.425 12:06:08 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:21.425 12:06:08 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:21.425 12:06:08 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1130544' 00:06:21.425 killing process with pid 1130544 00:06:21.425 12:06:08 -- common/autotest_common.sh@945 -- # kill 1130544 00:06:21.425 12:06:08 -- common/autotest_common.sh@950 -- # wait 1130544 00:06:21.685 00:06:21.685 real 0m2.203s 00:06:21.685 user 0m2.427s 00:06:21.685 sys 0m0.611s 00:06:21.685 12:06:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:21.685 12:06:08 -- common/autotest_common.sh@10 -- # set +x 00:06:21.685 ************************************ 00:06:21.685 END TEST locking_app_on_locked_coremask 00:06:21.685 
************************************ 00:06:21.685 12:06:08 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:21.685 12:06:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:21.685 12:06:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:21.685 12:06:08 -- common/autotest_common.sh@10 -- # set +x 00:06:21.685 ************************************ 00:06:21.685 START TEST locking_overlapped_coremask 00:06:21.685 ************************************ 00:06:21.685 12:06:08 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:06:21.685 12:06:08 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1130969 00:06:21.685 12:06:08 -- event/cpu_locks.sh@133 -- # waitforlisten 1130969 /var/tmp/spdk.sock 00:06:21.685 12:06:08 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:21.685 12:06:08 -- common/autotest_common.sh@819 -- # '[' -z 1130969 ']' 00:06:21.685 12:06:08 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:21.685 12:06:08 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:21.685 12:06:08 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:21.685 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:21.685 12:06:08 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:21.685 12:06:08 -- common/autotest_common.sh@10 -- # set +x 00:06:21.685 [2024-11-02 12:06:08.654409] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:21.685 [2024-11-02 12:06:08.654500] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1130969 ] 00:06:21.943 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.943 [2024-11-02 12:06:08.720804] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:21.943 [2024-11-02 12:06:08.755516] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:21.943 [2024-11-02 12:06:08.755670] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:21.943 [2024-11-02 12:06:08.755781] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.943 [2024-11-02 12:06:08.755783] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:22.510 12:06:09 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:22.510 12:06:09 -- common/autotest_common.sh@852 -- # return 0 00:06:22.510 12:06:09 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1131137 00:06:22.510 12:06:09 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1131137 /var/tmp/spdk2.sock 00:06:22.510 12:06:09 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:22.510 12:06:09 -- common/autotest_common.sh@640 -- # local es=0 00:06:22.510 12:06:09 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 1131137 /var/tmp/spdk2.sock 00:06:22.510 12:06:09 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:22.783 12:06:09 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:22.783 12:06:09 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:22.783 12:06:09 -- common/autotest_common.sh@632 
-- # case "$(type -t "$arg")" in 00:06:22.783 12:06:09 -- common/autotest_common.sh@643 -- # waitforlisten 1131137 /var/tmp/spdk2.sock 00:06:22.783 12:06:09 -- common/autotest_common.sh@819 -- # '[' -z 1131137 ']' 00:06:22.783 12:06:09 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:22.783 12:06:09 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:22.783 12:06:09 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:22.783 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:22.783 12:06:09 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:22.783 12:06:09 -- common/autotest_common.sh@10 -- # set +x 00:06:22.783 [2024-11-02 12:06:09.505974] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:22.783 [2024-11-02 12:06:09.506080] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1131137 ] 00:06:22.783 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.783 [2024-11-02 12:06:09.599226] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1130969 has claimed it. 00:06:22.783 [2024-11-02 12:06:09.599267] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:23.353 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (1131137) - No such process 00:06:23.353 ERROR: process (pid: 1131137) is no longer running 00:06:23.353 12:06:10 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:23.353 12:06:10 -- common/autotest_common.sh@852 -- # return 1 00:06:23.353 12:06:10 -- common/autotest_common.sh@643 -- # es=1 00:06:23.353 12:06:10 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:23.353 12:06:10 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:23.353 12:06:10 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:23.353 12:06:10 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:23.353 12:06:10 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:23.353 12:06:10 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:23.353 12:06:10 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:23.353 12:06:10 -- event/cpu_locks.sh@141 -- # killprocess 1130969 00:06:23.353 12:06:10 -- common/autotest_common.sh@926 -- # '[' -z 1130969 ']' 00:06:23.353 12:06:10 -- common/autotest_common.sh@930 -- # kill -0 1130969 00:06:23.353 12:06:10 -- common/autotest_common.sh@931 -- # uname 00:06:23.353 12:06:10 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:23.353 12:06:10 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1130969 00:06:23.353 12:06:10 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:23.353 12:06:10 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:23.353 12:06:10 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1130969' 00:06:23.353 killing process with pid 1130969 00:06:23.353 12:06:10 -- 
common/autotest_common.sh@945 -- # kill 1130969 00:06:23.353 12:06:10 -- common/autotest_common.sh@950 -- # wait 1130969 00:06:23.611 00:06:23.611 real 0m1.889s 00:06:23.611 user 0m5.423s 00:06:23.611 sys 0m0.469s 00:06:23.611 12:06:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.611 12:06:10 -- common/autotest_common.sh@10 -- # set +x 00:06:23.611 ************************************ 00:06:23.612 END TEST locking_overlapped_coremask 00:06:23.612 ************************************ 00:06:23.612 12:06:10 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:23.612 12:06:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:23.612 12:06:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:23.612 12:06:10 -- common/autotest_common.sh@10 -- # set +x 00:06:23.612 ************************************ 00:06:23.612 START TEST locking_overlapped_coremask_via_rpc 00:06:23.612 ************************************ 00:06:23.612 12:06:10 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:06:23.612 12:06:10 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:23.612 12:06:10 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1131285 00:06:23.612 12:06:10 -- event/cpu_locks.sh@149 -- # waitforlisten 1131285 /var/tmp/spdk.sock 00:06:23.612 12:06:10 -- common/autotest_common.sh@819 -- # '[' -z 1131285 ']' 00:06:23.612 12:06:10 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:23.612 12:06:10 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:23.612 12:06:10 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:23.612 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:23.612 12:06:10 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:23.612 12:06:10 -- common/autotest_common.sh@10 -- # set +x 00:06:23.612 [2024-11-02 12:06:10.582074] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:23.612 [2024-11-02 12:06:10.582144] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1131285 ] 00:06:23.870 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.870 [2024-11-02 12:06:10.646886] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
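locking_overlapped_coremask, finishing above, also confirms that the failed second target left the first target's locks untouched: with -m 0x7 the surviving instance should hold exactly /var/tmp/spdk_cpu_lock_000 through _002. check_remaining_locks, reconstructed from the trace, compares the live glob against a brace expansion of the expected names:

    check_remaining_locks() {
        locks=(/var/tmp/spdk_cpu_lock_*)
        locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
        # both arrays must expand to the identical, ordered list of paths
        [[ ${locks[*]} == "${locks_expected[*]}" ]]
    }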
00:06:23.870 [2024-11-02 12:06:10.646912] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:23.870 [2024-11-02 12:06:10.686984] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:23.870 [2024-11-02 12:06:10.687136] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:23.870 [2024-11-02 12:06:10.687221] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:23.870 [2024-11-02 12:06:10.687223] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.804 12:06:11 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:24.804 12:06:11 -- common/autotest_common.sh@852 -- # return 0 00:06:24.804 12:06:11 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1131550 00:06:24.804 12:06:11 -- event/cpu_locks.sh@153 -- # waitforlisten 1131550 /var/tmp/spdk2.sock 00:06:24.804 12:06:11 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:24.804 12:06:11 -- common/autotest_common.sh@819 -- # '[' -z 1131550 ']' 00:06:24.805 12:06:11 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:24.805 12:06:11 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:24.805 12:06:11 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:24.805 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:24.805 12:06:11 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:24.805 12:06:11 -- common/autotest_common.sh@10 -- # set +x 00:06:24.805 [2024-11-02 12:06:11.467862] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:24.805 [2024-11-02 12:06:11.467950] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1131550 ] 00:06:24.805 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.805 [2024-11-02 12:06:11.561974] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:06:24.805 [2024-11-02 12:06:11.562008] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:24.805 [2024-11-02 12:06:11.641048] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:24.805 [2024-11-02 12:06:11.641195] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:24.805 [2024-11-02 12:06:11.641329] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:24.805 [2024-11-02 12:06:11.641331] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:25.419 12:06:12 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:25.419 12:06:12 -- common/autotest_common.sh@852 -- # return 0 00:06:25.419 12:06:12 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:25.419 12:06:12 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:25.419 12:06:12 -- common/autotest_common.sh@10 -- # set +x 00:06:25.419 12:06:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:25.419 12:06:12 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:25.419 12:06:12 -- common/autotest_common.sh@640 -- # local es=0 00:06:25.419 12:06:12 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:25.419 12:06:12 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:06:25.419 12:06:12 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:25.419 12:06:12 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:06:25.419 12:06:12 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:25.419 12:06:12 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:25.419 12:06:12 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:25.419 12:06:12 -- common/autotest_common.sh@10 -- # set +x 00:06:25.419 [2024-11-02 12:06:12.323055] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1131285 has claimed it. 00:06:25.419 request: 00:06:25.419 { 00:06:25.419 "method": "framework_enable_cpumask_locks", 00:06:25.419 "req_id": 1 00:06:25.419 } 00:06:25.419 Got JSON-RPC error response 00:06:25.419 response: 00:06:25.419 { 00:06:25.419 "code": -32603, 00:06:25.419 "message": "Failed to claim CPU core: 2" 00:06:25.419 } 00:06:25.419 12:06:12 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:06:25.419 12:06:12 -- common/autotest_common.sh@643 -- # es=1 00:06:25.419 12:06:12 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:25.419 12:06:12 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:25.419 12:06:12 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:25.419 12:06:12 -- event/cpu_locks.sh@158 -- # waitforlisten 1131285 /var/tmp/spdk.sock 00:06:25.419 12:06:12 -- common/autotest_common.sh@819 -- # '[' -z 1131285 ']' 00:06:25.419 12:06:12 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:25.419 12:06:12 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:25.419 12:06:12 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:25.419 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
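The request/response pair above is the core assertion of this test: both targets start with --disable-cpumask-locks, the first enables the locks over RPC and wins cores 0-2 (mask 0x7), and the second (-m 0x1c, cores 2-4) must get a JSON-RPC error because core 2 is already claimed; -32603 is the standard JSON-RPC 2.0 "Internal error" code. A sketch of driving that expected failure by hand, with $rpc as before standing in for scripts/rpc.py:

    # first target (started with --disable-cpumask-locks) takes its cores
    "$rpc" -s /var/tmp/spdk.sock framework_enable_cpumask_locks
    # second target overlaps on core 2, so this call must fail
    if ! "$rpc" -s /var/tmp/spdk2.sock framework_enable_cpumask_locks; then
        echo "overlap detected: core already claimed by the first instance"
    fi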
00:06:25.419 12:06:12 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:25.419 12:06:12 -- common/autotest_common.sh@10 -- # set +x 00:06:25.729 12:06:12 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:25.729 12:06:12 -- common/autotest_common.sh@852 -- # return 0 00:06:25.729 12:06:12 -- event/cpu_locks.sh@159 -- # waitforlisten 1131550 /var/tmp/spdk2.sock 00:06:25.729 12:06:12 -- common/autotest_common.sh@819 -- # '[' -z 1131550 ']' 00:06:25.729 12:06:12 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:25.729 12:06:12 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:25.729 12:06:12 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:25.729 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:25.729 12:06:12 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:25.729 12:06:12 -- common/autotest_common.sh@10 -- # set +x 00:06:25.987 12:06:12 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:25.988 12:06:12 -- common/autotest_common.sh@852 -- # return 0 00:06:25.988 12:06:12 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:25.988 12:06:12 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:25.988 12:06:12 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:25.988 12:06:12 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:25.988 00:06:25.988 real 0m2.151s 00:06:25.988 user 0m0.888s 00:06:25.988 sys 0m0.193s 00:06:25.988 12:06:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:25.988 12:06:12 -- common/autotest_common.sh@10 -- # set +x 00:06:25.988 ************************************ 00:06:25.988 END TEST locking_overlapped_coremask_via_rpc 00:06:25.988 ************************************ 00:06:25.988 12:06:12 -- event/cpu_locks.sh@174 -- # cleanup 00:06:25.988 12:06:12 -- event/cpu_locks.sh@15 -- # [[ -z 1131285 ]] 00:06:25.988 12:06:12 -- event/cpu_locks.sh@15 -- # killprocess 1131285 00:06:25.988 12:06:12 -- common/autotest_common.sh@926 -- # '[' -z 1131285 ']' 00:06:25.988 12:06:12 -- common/autotest_common.sh@930 -- # kill -0 1131285 00:06:25.988 12:06:12 -- common/autotest_common.sh@931 -- # uname 00:06:25.988 12:06:12 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:25.988 12:06:12 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1131285 00:06:25.988 12:06:12 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:25.988 12:06:12 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:25.988 12:06:12 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1131285' 00:06:25.988 killing process with pid 1131285 00:06:25.988 12:06:12 -- common/autotest_common.sh@945 -- # kill 1131285 00:06:25.988 12:06:12 -- common/autotest_common.sh@950 -- # wait 1131285 00:06:26.246 12:06:13 -- event/cpu_locks.sh@16 -- # [[ -z 1131550 ]] 00:06:26.246 12:06:13 -- event/cpu_locks.sh@16 -- # killprocess 1131550 00:06:26.246 12:06:13 -- common/autotest_common.sh@926 -- # '[' -z 1131550 ']' 00:06:26.246 12:06:13 -- common/autotest_common.sh@930 -- # kill -0 1131550 00:06:26.246 12:06:13 -- common/autotest_common.sh@931 -- # uname 
00:06:26.246 12:06:13 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:26.246 12:06:13 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1131550 00:06:26.246 12:06:13 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:06:26.246 12:06:13 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:06:26.246 12:06:13 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1131550' 00:06:26.246 killing process with pid 1131550 00:06:26.246 12:06:13 -- common/autotest_common.sh@945 -- # kill 1131550 00:06:26.246 12:06:13 -- common/autotest_common.sh@950 -- # wait 1131550 00:06:26.504 12:06:13 -- event/cpu_locks.sh@18 -- # rm -f 00:06:26.763 12:06:13 -- event/cpu_locks.sh@1 -- # cleanup 00:06:26.763 12:06:13 -- event/cpu_locks.sh@15 -- # [[ -z 1131285 ]] 00:06:26.763 12:06:13 -- event/cpu_locks.sh@15 -- # killprocess 1131285 00:06:26.763 12:06:13 -- common/autotest_common.sh@926 -- # '[' -z 1131285 ']' 00:06:26.763 12:06:13 -- common/autotest_common.sh@930 -- # kill -0 1131285 00:06:26.763 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (1131285) - No such process 00:06:26.763 12:06:13 -- common/autotest_common.sh@953 -- # echo 'Process with pid 1131285 is not found' 00:06:26.763 Process with pid 1131285 is not found 00:06:26.763 12:06:13 -- event/cpu_locks.sh@16 -- # [[ -z 1131550 ]] 00:06:26.763 12:06:13 -- event/cpu_locks.sh@16 -- # killprocess 1131550 00:06:26.763 12:06:13 -- common/autotest_common.sh@926 -- # '[' -z 1131550 ']' 00:06:26.763 12:06:13 -- common/autotest_common.sh@930 -- # kill -0 1131550 00:06:26.763 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (1131550) - No such process 00:06:26.763 12:06:13 -- common/autotest_common.sh@953 -- # echo 'Process with pid 1131550 is not found' 00:06:26.763 Process with pid 1131550 is not found 00:06:26.763 12:06:13 -- event/cpu_locks.sh@18 -- # rm -f 00:06:26.763 00:06:26.763 real 0m17.970s 00:06:26.763 user 0m31.023s 00:06:26.763 sys 0m5.607s 00:06:26.763 12:06:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:26.763 12:06:13 -- common/autotest_common.sh@10 -- # set +x 00:06:26.763 ************************************ 00:06:26.763 END TEST cpu_locks 00:06:26.763 ************************************ 00:06:26.763 00:06:26.763 real 0m42.754s 00:06:26.763 user 1m20.478s 00:06:26.763 sys 0m9.677s 00:06:26.763 12:06:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:26.763 12:06:13 -- common/autotest_common.sh@10 -- # set +x 00:06:26.763 ************************************ 00:06:26.763 END TEST event 00:06:26.763 ************************************ 00:06:26.763 12:06:13 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:26.763 12:06:13 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:26.763 12:06:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:26.763 12:06:13 -- common/autotest_common.sh@10 -- # set +x 00:06:26.763 ************************************ 00:06:26.763 START TEST thread 00:06:26.763 ************************************ 00:06:26.763 12:06:13 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:26.763 * Looking for test storage... 
00:06:26.763 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:26.763 12:06:13 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:26.763 12:06:13 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:26.763 12:06:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:26.763 12:06:13 -- common/autotest_common.sh@10 -- # set +x 00:06:26.763 ************************************ 00:06:26.763 START TEST thread_poller_perf 00:06:26.763 ************************************ 00:06:26.763 12:06:13 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:26.763 [2024-11-02 12:06:13.704299] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:26.763 [2024-11-02 12:06:13.704398] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1131924 ] 00:06:27.022 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.022 [2024-11-02 12:06:13.772458] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.022 [2024-11-02 12:06:13.808768] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.022 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:06:27.957 [2024-11-02T11:06:14.933Z] ====================================== 00:06:27.957 [2024-11-02T11:06:14.933Z] busy:2506528624 (cyc) 00:06:27.957 [2024-11-02T11:06:14.933Z] total_run_count: 783000 00:06:27.957 [2024-11-02T11:06:14.933Z] tsc_hz: 2500000000 (cyc) 00:06:27.957 [2024-11-02T11:06:14.933Z] ====================================== 00:06:27.957 [2024-11-02T11:06:14.933Z] poller_cost: 3201 (cyc), 1280 (nsec) 00:06:27.957 00:06:27.957 real 0m1.182s 00:06:27.957 user 0m1.091s 00:06:27.957 sys 0m0.086s 00:06:27.957 12:06:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.957 12:06:14 -- common/autotest_common.sh@10 -- # set +x 00:06:27.957 ************************************ 00:06:27.957 END TEST thread_poller_perf 00:06:27.957 ************************************ 00:06:27.957 12:06:14 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:27.957 12:06:14 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:27.957 12:06:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:27.957 12:06:14 -- common/autotest_common.sh@10 -- # set +x 00:06:27.957 ************************************ 00:06:27.957 START TEST thread_poller_perf 00:06:27.957 ************************************ 00:06:27.957 12:06:14 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:28.216 [2024-11-02 12:06:14.936723] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:06:28.216 [2024-11-02 12:06:14.936817] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1132206 ] 00:06:28.216 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.216 [2024-11-02 12:06:15.003918] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.216 [2024-11-02 12:06:15.038800] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.216 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:06:29.150 [2024-11-02T11:06:16.126Z] ====================================== 00:06:29.150 [2024-11-02T11:06:16.126Z] busy:2501893598 (cyc) 00:06:29.150 [2024-11-02T11:06:16.126Z] total_run_count: 13035000 00:06:29.150 [2024-11-02T11:06:16.126Z] tsc_hz: 2500000000 (cyc) 00:06:29.150 [2024-11-02T11:06:16.126Z] ====================================== 00:06:29.150 [2024-11-02T11:06:16.126Z] poller_cost: 191 (cyc), 76 (nsec) 00:06:29.150 00:06:29.150 real 0m1.173s 00:06:29.150 user 0m1.080s 00:06:29.150 sys 0m0.088s 00:06:29.150 12:06:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.150 12:06:16 -- common/autotest_common.sh@10 -- # set +x 00:06:29.150 ************************************ 00:06:29.150 END TEST thread_poller_perf 00:06:29.150 ************************************ 00:06:29.409 12:06:16 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:29.409 12:06:16 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:29.409 12:06:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:29.409 12:06:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:29.409 12:06:16 -- common/autotest_common.sh@10 -- # set +x 00:06:29.409 ************************************ 00:06:29.409 START TEST thread_spdk_lock 00:06:29.409 ************************************ 00:06:29.409 12:06:16 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:29.409 [2024-11-02 12:06:16.159914] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:06:29.409 [2024-11-02 12:06:16.160024] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1132496 ] 00:06:29.409 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.409 [2024-11-02 12:06:16.228247] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:29.409 [2024-11-02 12:06:16.265075] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:29.409 [2024-11-02 12:06:16.265077] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.976 [2024-11-02 12:06:16.751834] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.976 [2024-11-02 12:06:16.751873] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:29.976 [2024-11-02 12:06:16.751884] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x12e2e40 00:06:29.976 [2024-11-02 12:06:16.752739] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.976 [2024-11-02 12:06:16.752844] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.976 [2024-11-02 12:06:16.752867] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.976 Starting test contend 00:06:29.976 Worker Delay Wait us Hold us Total us 00:06:29.976 0 3 165034 181963 346998 00:06:29.976 1 5 85959 283967 369926 00:06:29.976 PASS test contend 00:06:29.976 Starting test hold_by_poller 00:06:29.976 PASS test hold_by_poller 00:06:29.976 Starting test hold_by_message 00:06:29.976 PASS test hold_by_message 00:06:29.976 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:29.976 100014 assertions passed 00:06:29.976 0 assertions failed 00:06:29.976 00:06:29.976 real 0m0.661s 00:06:29.976 user 0m1.062s 00:06:29.976 sys 0m0.083s 00:06:29.976 12:06:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.976 12:06:16 -- common/autotest_common.sh@10 -- # set +x 00:06:29.976 ************************************ 00:06:29.976 END TEST thread_spdk_lock 00:06:29.976 ************************************ 00:06:29.976 00:06:29.976 real 0m3.268s 00:06:29.976 user 0m3.324s 00:06:29.976 sys 0m0.454s 00:06:29.976 12:06:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.976 12:06:16 -- common/autotest_common.sh@10 -- # set +x 00:06:29.976 ************************************ 00:06:29.976 END TEST thread 00:06:29.976 ************************************ 00:06:29.976 12:06:16 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:29.976 12:06:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 
00:06:29.976 12:06:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:29.976 12:06:16 -- common/autotest_common.sh@10 -- # set +x 00:06:29.976 ************************************ 00:06:29.976 START TEST accel 00:06:29.976 ************************************ 00:06:29.976 12:06:16 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:30.235 * Looking for test storage... 00:06:30.235 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:30.235 12:06:16 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:30.235 12:06:16 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:30.235 12:06:16 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:30.235 12:06:16 -- accel/accel.sh@59 -- # spdk_tgt_pid=1132593 00:06:30.235 12:06:16 -- accel/accel.sh@60 -- # waitforlisten 1132593 00:06:30.235 12:06:16 -- common/autotest_common.sh@819 -- # '[' -z 1132593 ']' 00:06:30.235 12:06:16 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:30.235 12:06:16 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:30.235 12:06:16 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:30.235 12:06:16 -- accel/accel.sh@58 -- # build_accel_config 00:06:30.235 12:06:16 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:30.235 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:30.235 12:06:16 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:30.235 12:06:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:30.235 12:06:16 -- common/autotest_common.sh@10 -- # set +x 00:06:30.235 12:06:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:30.235 12:06:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:30.235 12:06:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:30.235 12:06:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:30.235 12:06:16 -- accel/accel.sh@41 -- # local IFS=, 00:06:30.235 12:06:16 -- accel/accel.sh@42 -- # jq -r . 00:06:30.235 [2024-11-02 12:06:17.016270] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:30.235 [2024-11-02 12:06:17.016368] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1132593 ] 00:06:30.235 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.235 [2024-11-02 12:06:17.083146] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.235 [2024-11-02 12:06:17.119477] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:30.235 [2024-11-02 12:06:17.119597] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.169 12:06:17 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:31.169 12:06:17 -- common/autotest_common.sh@852 -- # return 0 00:06:31.169 12:06:17 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:31.169 12:06:17 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:31.169 12:06:17 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:31.169 12:06:17 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:31.169 12:06:17 -- common/autotest_common.sh@10 -- # set +x 00:06:31.169 12:06:17 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 
12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # IFS== 00:06:31.169 12:06:17 -- accel/accel.sh@64 -- # read -r opc module 00:06:31.169 12:06:17 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:31.169 12:06:17 -- accel/accel.sh@67 -- # killprocess 1132593 00:06:31.169 12:06:17 -- common/autotest_common.sh@926 -- # '[' -z 1132593 ']' 00:06:31.169 12:06:17 -- common/autotest_common.sh@930 -- # kill -0 1132593 00:06:31.169 12:06:17 -- common/autotest_common.sh@931 -- # uname 00:06:31.169 12:06:17 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:31.169 12:06:17 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1132593 00:06:31.169 12:06:17 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:31.169 12:06:17 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:31.169 12:06:17 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1132593' 00:06:31.169 killing process with pid 1132593 00:06:31.169 12:06:17 -- common/autotest_common.sh@945 -- # kill 1132593 00:06:31.169 12:06:17 -- common/autotest_common.sh@950 -- # wait 1132593 00:06:31.428 12:06:18 -- accel/accel.sh@68 -- # trap - ERR 00:06:31.428 12:06:18 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:31.428 12:06:18 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:06:31.428 12:06:18 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:31.428 12:06:18 -- common/autotest_common.sh@10 -- # set +x 00:06:31.428 12:06:18 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:06:31.428 12:06:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:31.428 12:06:18 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.428 12:06:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.428 12:06:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.428 12:06:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.428 12:06:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.428 12:06:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.428 12:06:18 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.428 12:06:18 -- accel/accel.sh@42 -- # jq -r . 
00:06:31.428 12:06:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.428 12:06:18 -- common/autotest_common.sh@10 -- # set +x 00:06:31.428 12:06:18 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:31.428 12:06:18 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:31.428 12:06:18 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:31.428 12:06:18 -- common/autotest_common.sh@10 -- # set +x 00:06:31.428 ************************************ 00:06:31.428 START TEST accel_missing_filename 00:06:31.428 ************************************ 00:06:31.428 12:06:18 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:06:31.428 12:06:18 -- common/autotest_common.sh@640 -- # local es=0 00:06:31.428 12:06:18 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:31.428 12:06:18 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:31.428 12:06:18 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.428 12:06:18 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:31.428 12:06:18 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.428 12:06:18 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:06:31.428 12:06:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:31.428 12:06:18 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.428 12:06:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.428 12:06:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.428 12:06:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.428 12:06:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.428 12:06:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.428 12:06:18 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.428 12:06:18 -- accel/accel.sh@42 -- # jq -r . 00:06:31.428 [2024-11-02 12:06:18.347987] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:31.428 [2024-11-02 12:06:18.348097] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1132870 ] 00:06:31.428 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.686 [2024-11-02 12:06:18.417034] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.687 [2024-11-02 12:06:18.455578] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.687 [2024-11-02 12:06:18.495661] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:31.687 [2024-11-02 12:06:18.555827] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:31.687 A filename is required. 
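accel_perf aborts here by design: a compress workload reads its input from a file, and none was supplied. A minimal sketch of the failing call next to a working one, assuming the repo-relative paths this suite uses elsewhere:

    # fails as above: compress needs an input file
    ./build/examples/accel_perf -t 1 -w compress
    # works: -l names the uncompressed input file (test/accel/bib in this suite)
    ./build/examples/accel_perf -t 1 -w compress -l test/accel/bib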
00:06:31.687 12:06:18 -- common/autotest_common.sh@643 -- # es=234 00:06:31.687 12:06:18 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:31.687 12:06:18 -- common/autotest_common.sh@652 -- # es=106 00:06:31.687 12:06:18 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:31.687 12:06:18 -- common/autotest_common.sh@660 -- # es=1 00:06:31.687 12:06:18 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:31.687 00:06:31.687 real 0m0.290s 00:06:31.687 user 0m0.195s 00:06:31.687 sys 0m0.133s 00:06:31.687 12:06:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.687 12:06:18 -- common/autotest_common.sh@10 -- # set +x 00:06:31.687 ************************************ 00:06:31.687 END TEST accel_missing_filename 00:06:31.687 ************************************ 00:06:31.687 12:06:18 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.687 12:06:18 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:31.687 12:06:18 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:31.687 12:06:18 -- common/autotest_common.sh@10 -- # set +x 00:06:31.944 ************************************ 00:06:31.944 START TEST accel_compress_verify 00:06:31.944 ************************************ 00:06:31.944 12:06:18 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.944 12:06:18 -- common/autotest_common.sh@640 -- # local es=0 00:06:31.945 12:06:18 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.945 12:06:18 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:31.945 12:06:18 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.945 12:06:18 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:31.945 12:06:18 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.945 12:06:18 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.945 12:06:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.945 12:06:18 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.945 12:06:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.945 12:06:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.945 12:06:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.945 12:06:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.945 12:06:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.945 12:06:18 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.945 12:06:18 -- accel/accel.sh@42 -- # jq -r . 00:06:31.945 [2024-11-02 12:06:18.687052] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:06:31.945 [2024-11-02 12:06:18.687145] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1132987 ] 00:06:31.945 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.945 [2024-11-02 12:06:18.756015] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.945 [2024-11-02 12:06:18.791709] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.945 [2024-11-02 12:06:18.831519] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:31.945 [2024-11-02 12:06:18.890949] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:32.204 00:06:32.204 Compression does not support the verify option, aborting. 00:06:32.204 12:06:18 -- common/autotest_common.sh@643 -- # es=161 00:06:32.204 12:06:18 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:32.204 12:06:18 -- common/autotest_common.sh@652 -- # es=33 00:06:32.204 12:06:18 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:32.204 12:06:18 -- common/autotest_common.sh@660 -- # es=1 00:06:32.204 12:06:18 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:32.204 00:06:32.204 real 0m0.286s 00:06:32.204 user 0m0.196s 00:06:32.204 sys 0m0.130s 00:06:32.204 12:06:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.204 12:06:18 -- common/autotest_common.sh@10 -- # set +x 00:06:32.204 ************************************ 00:06:32.204 END TEST accel_compress_verify 00:06:32.204 ************************************ 00:06:32.204 12:06:18 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:32.204 12:06:18 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:32.204 12:06:18 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:32.204 12:06:18 -- common/autotest_common.sh@10 -- # set +x 00:06:32.204 ************************************ 00:06:32.204 START TEST accel_wrong_workload 00:06:32.204 ************************************ 00:06:32.204 12:06:18 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:06:32.204 12:06:18 -- common/autotest_common.sh@640 -- # local es=0 00:06:32.204 12:06:18 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:32.204 12:06:18 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:32.204 12:06:18 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:32.204 12:06:18 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:32.204 12:06:18 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:32.204 12:06:18 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:06:32.204 12:06:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:32.204 12:06:19 -- accel/accel.sh@12 -- # build_accel_config 00:06:32.204 12:06:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.204 12:06:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.204 12:06:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.204 12:06:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.204 12:06:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.204 12:06:19 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.204 12:06:19 -- accel/accel.sh@42 -- # jq -r . 
00:06:32.204 Unsupported workload type: foobar 00:06:32.204 [2024-11-02 12:06:19.018129] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:32.204 accel_perf options: 00:06:32.204 [-h help message] 00:06:32.204 [-q queue depth per core] 00:06:32.204 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:32.204 [-T number of threads per core 00:06:32.204 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:32.204 [-t time in seconds] 00:06:32.204 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:32.204 [ dif_verify, , dif_generate, dif_generate_copy 00:06:32.204 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:32.204 [-l for compress/decompress workloads, name of uncompressed input file 00:06:32.204 [-S for crc32c workload, use this seed value (default 0) 00:06:32.204 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:32.204 [-f for fill workload, use this BYTE value (default 255) 00:06:32.204 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:32.204 [-y verify result if this switch is on] 00:06:32.204 [-a tasks to allocate per core (default: same value as -q)] 00:06:32.204 Can be used to spread operations across a wider range of memory. 00:06:32.204 12:06:19 -- common/autotest_common.sh@643 -- # es=1 00:06:32.204 12:06:19 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:32.204 12:06:19 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:32.204 12:06:19 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:32.204 00:06:32.204 real 0m0.029s 00:06:32.204 user 0m0.012s 00:06:32.204 sys 0m0.017s 00:06:32.204 12:06:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.204 12:06:19 -- common/autotest_common.sh@10 -- # set +x 00:06:32.204 ************************************ 00:06:32.204 END TEST accel_wrong_workload 00:06:32.204 ************************************ 00:06:32.204 Error: writing output failed: Broken pipe 00:06:32.204 12:06:19 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:32.204 12:06:19 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:32.204 12:06:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:32.204 12:06:19 -- common/autotest_common.sh@10 -- # set +x 00:06:32.204 ************************************ 00:06:32.204 START TEST accel_negative_buffers 00:06:32.204 ************************************ 00:06:32.204 12:06:19 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:32.204 12:06:19 -- common/autotest_common.sh@640 -- # local es=0 00:06:32.204 12:06:19 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:32.204 12:06:19 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:32.204 12:06:19 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:32.204 12:06:19 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:32.204 12:06:19 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:32.204 12:06:19 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:06:32.204 12:06:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:32.204 12:06:19 -- accel/accel.sh@12 -- # build_accel_config 00:06:32.204 12:06:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.204 12:06:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.204 12:06:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.204 12:06:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.204 12:06:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.204 12:06:19 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.204 12:06:19 -- accel/accel.sh@42 -- # jq -r . 00:06:32.204 -x option must be non-negative. 00:06:32.204 [2024-11-02 12:06:19.091495] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:32.204 accel_perf options: 00:06:32.204 [-h help message] 00:06:32.204 [-q queue depth per core] 00:06:32.204 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:32.204 [-T number of threads per core 00:06:32.204 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:32.204 [-t time in seconds] 00:06:32.204 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:32.204 [ dif_verify, , dif_generate, dif_generate_copy 00:06:32.204 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:32.204 [-l for compress/decompress workloads, name of uncompressed input file 00:06:32.204 [-S for crc32c workload, use this seed value (default 0) 00:06:32.204 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:32.204 [-f for fill workload, use this BYTE value (default 255) 00:06:32.204 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:32.204 [-y verify result if this switch is on] 00:06:32.204 [-a tasks to allocate per core (default: same value as -q)] 00:06:32.204 Can be used to spread operations across a wider range of memory. 
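Per the help text above, -x sets the number of xor source buffers and its minimum is 2; the test passes -1 deliberately to exercise the argument check. A minimal valid counterpart, paths assumed as in the sketches above:

    # smallest accepted value: two source buffers, with result verification (-y)
    ./build/examples/accel_perf -t 1 -w xor -y -x 2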
00:06:32.204 12:06:19 -- common/autotest_common.sh@643 -- # es=1 00:06:32.204 12:06:19 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:32.204 12:06:19 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:32.204 12:06:19 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:32.204 00:06:32.204 real 0m0.027s 00:06:32.204 user 0m0.010s 00:06:32.204 sys 0m0.016s 00:06:32.204 12:06:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.204 12:06:19 -- common/autotest_common.sh@10 -- # set +x 00:06:32.204 ************************************ 00:06:32.204 END TEST accel_negative_buffers 00:06:32.204 ************************************ 00:06:32.204 Error: writing output failed: Broken pipe 00:06:32.204 12:06:19 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:32.204 12:06:19 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:32.204 12:06:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:32.204 12:06:19 -- common/autotest_common.sh@10 -- # set +x 00:06:32.204 ************************************ 00:06:32.204 START TEST accel_crc32c 00:06:32.204 ************************************ 00:06:32.204 12:06:19 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:32.204 12:06:19 -- accel/accel.sh@16 -- # local accel_opc 00:06:32.204 12:06:19 -- accel/accel.sh@17 -- # local accel_module 00:06:32.204 12:06:19 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:32.204 12:06:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:32.204 12:06:19 -- accel/accel.sh@12 -- # build_accel_config 00:06:32.204 12:06:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.204 12:06:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.204 12:06:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.204 12:06:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.204 12:06:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.204 12:06:19 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.204 12:06:19 -- accel/accel.sh@42 -- # jq -r . 00:06:32.204 [2024-11-02 12:06:19.167091] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:32.204 [2024-11-02 12:06:19.167183] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1133205 ] 00:06:32.463 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.463 [2024-11-02 12:06:19.237870] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.463 [2024-11-02 12:06:19.279987] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.835 12:06:20 -- accel/accel.sh@18 -- # out=' 00:06:33.835 SPDK Configuration: 00:06:33.835 Core mask: 0x1 00:06:33.835 00:06:33.835 Accel Perf Configuration: 00:06:33.835 Workload Type: crc32c 00:06:33.835 CRC-32C seed: 32 00:06:33.835 Transfer size: 4096 bytes 00:06:33.835 Vector count 1 00:06:33.835 Module: software 00:06:33.835 Queue depth: 32 00:06:33.835 Allocate depth: 32 00:06:33.835 # threads/core: 1 00:06:33.835 Run time: 1 seconds 00:06:33.835 Verify: Yes 00:06:33.835 00:06:33.835 Running for 1 seconds... 
00:06:33.835 00:06:33.835 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:33.835 ------------------------------------------------------------------------------------ 00:06:33.835 0,0 822976/s 3214 MiB/s 0 0 00:06:33.835 ==================================================================================== 00:06:33.835 Total 822976/s 3214 MiB/s 0 0' 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.835 12:06:20 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:33.835 12:06:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:33.835 12:06:20 -- accel/accel.sh@12 -- # build_accel_config 00:06:33.835 12:06:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:33.835 12:06:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.835 12:06:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.835 12:06:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:33.835 12:06:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:33.835 12:06:20 -- accel/accel.sh@41 -- # local IFS=, 00:06:33.835 12:06:20 -- accel/accel.sh@42 -- # jq -r . 00:06:33.835 [2024-11-02 12:06:20.465699] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:33.835 [2024-11-02 12:06:20.465797] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1133421 ] 00:06:33.835 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.835 [2024-11-02 12:06:20.533495] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.835 [2024-11-02 12:06:20.568632] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.835 12:06:20 -- accel/accel.sh@21 -- # val= 00:06:33.835 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.835 12:06:20 -- accel/accel.sh@21 -- # val= 00:06:33.835 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.835 12:06:20 -- accel/accel.sh@21 -- # val=0x1 00:06:33.835 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.835 12:06:20 -- accel/accel.sh@21 -- # val= 00:06:33.835 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.835 12:06:20 -- accel/accel.sh@21 -- # val= 00:06:33.835 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.835 12:06:20 -- accel/accel.sh@21 -- # val=crc32c 00:06:33.835 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.835 12:06:20 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.835 12:06:20 -- accel/accel.sh@21 -- # val=32 00:06:33.835 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.835 
12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.835 12:06:20 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:33.835 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.835 12:06:20 -- accel/accel.sh@21 -- # val= 00:06:33.835 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.835 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.835 12:06:20 -- accel/accel.sh@21 -- # val=software 00:06:33.835 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.835 12:06:20 -- accel/accel.sh@23 -- # accel_module=software 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.836 12:06:20 -- accel/accel.sh@21 -- # val=32 00:06:33.836 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.836 12:06:20 -- accel/accel.sh@21 -- # val=32 00:06:33.836 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.836 12:06:20 -- accel/accel.sh@21 -- # val=1 00:06:33.836 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.836 12:06:20 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:33.836 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.836 12:06:20 -- accel/accel.sh@21 -- # val=Yes 00:06:33.836 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.836 12:06:20 -- accel/accel.sh@21 -- # val= 00:06:33.836 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:33.836 12:06:20 -- accel/accel.sh@21 -- # val= 00:06:33.836 12:06:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # IFS=: 00:06:33.836 12:06:20 -- accel/accel.sh@20 -- # read -r var val 00:06:34.771 12:06:21 -- accel/accel.sh@21 -- # val= 00:06:34.771 12:06:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # IFS=: 00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # read -r var val 00:06:34.771 12:06:21 -- accel/accel.sh@21 -- # val= 00:06:34.771 12:06:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # IFS=: 00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # read -r var val 00:06:34.771 12:06:21 -- accel/accel.sh@21 -- # val= 00:06:34.771 12:06:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # IFS=: 00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # read -r var val 00:06:34.771 12:06:21 -- accel/accel.sh@21 -- # val= 00:06:34.771 12:06:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # IFS=: 00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # read -r var val 00:06:34.771 12:06:21 -- accel/accel.sh@21 -- # val= 00:06:34.771 12:06:21 -- accel/accel.sh@22 -- # case "$var" in 
00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # IFS=: 00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # read -r var val 00:06:34.771 12:06:21 -- accel/accel.sh@21 -- # val= 00:06:34.771 12:06:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # IFS=: 00:06:34.771 12:06:21 -- accel/accel.sh@20 -- # read -r var val 00:06:34.771 12:06:21 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:34.771 12:06:21 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:34.771 12:06:21 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:34.771 00:06:34.771 real 0m2.591s 00:06:34.771 user 0m2.334s 00:06:34.771 sys 0m0.265s 00:06:34.771 12:06:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.771 12:06:21 -- common/autotest_common.sh@10 -- # set +x 00:06:34.771 ************************************ 00:06:34.771 END TEST accel_crc32c 00:06:34.771 ************************************ 00:06:35.029 12:06:21 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:35.029 12:06:21 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:35.029 12:06:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:35.029 12:06:21 -- common/autotest_common.sh@10 -- # set +x 00:06:35.029 ************************************ 00:06:35.029 START TEST accel_crc32c_C2 00:06:35.029 ************************************ 00:06:35.029 12:06:21 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:35.029 12:06:21 -- accel/accel.sh@16 -- # local accel_opc 00:06:35.029 12:06:21 -- accel/accel.sh@17 -- # local accel_module 00:06:35.029 12:06:21 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:35.029 12:06:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:35.029 12:06:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.029 12:06:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.029 12:06:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.029 12:06:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.029 12:06:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.029 12:06:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.029 12:06:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.029 12:06:21 -- accel/accel.sh@42 -- # jq -r . 00:06:35.029 [2024-11-02 12:06:21.805639] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:06:35.029 [2024-11-02 12:06:21.805748] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1133604 ] 00:06:35.029 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.029 [2024-11-02 12:06:21.873210] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.029 [2024-11-02 12:06:21.909036] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.403 12:06:23 -- accel/accel.sh@18 -- # out=' 00:06:36.403 SPDK Configuration: 00:06:36.403 Core mask: 0x1 00:06:36.403 00:06:36.403 Accel Perf Configuration: 00:06:36.403 Workload Type: crc32c 00:06:36.403 CRC-32C seed: 0 00:06:36.403 Transfer size: 4096 bytes 00:06:36.403 Vector count 2 00:06:36.403 Module: software 00:06:36.403 Queue depth: 32 00:06:36.403 Allocate depth: 32 00:06:36.403 # threads/core: 1 00:06:36.403 Run time: 1 seconds 00:06:36.403 Verify: Yes 00:06:36.403 00:06:36.403 Running for 1 seconds... 00:06:36.403 00:06:36.403 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:36.403 ------------------------------------------------------------------------------------ 00:06:36.403 0,0 614528/s 4801 MiB/s 0 0 00:06:36.403 ==================================================================================== 00:06:36.403 Total 614528/s 2400 MiB/s 0 0' 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:36.403 12:06:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:36.403 12:06:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.403 12:06:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.403 12:06:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.403 12:06:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.403 12:06:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.403 12:06:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.403 12:06:23 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.403 12:06:23 -- accel/accel.sh@42 -- # jq -r . 00:06:36.403 [2024-11-02 12:06:23.089611] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:06:36.403 [2024-11-02 12:06:23.089724] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1133773 ] 00:06:36.403 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.403 [2024-11-02 12:06:23.158491] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.403 [2024-11-02 12:06:23.193233] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val= 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val= 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val=0x1 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val= 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val= 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val=crc32c 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val=0 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val= 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val=software 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@23 -- # accel_module=software 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val=32 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val=32 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- 
accel/accel.sh@21 -- # val=1 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val=Yes 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val= 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:36.403 12:06:23 -- accel/accel.sh@21 -- # val= 00:06:36.403 12:06:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # IFS=: 00:06:36.403 12:06:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.777 12:06:24 -- accel/accel.sh@21 -- # val= 00:06:37.777 12:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:37.777 12:06:24 -- accel/accel.sh@21 -- # val= 00:06:37.777 12:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:37.777 12:06:24 -- accel/accel.sh@21 -- # val= 00:06:37.777 12:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:37.777 12:06:24 -- accel/accel.sh@21 -- # val= 00:06:37.777 12:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:37.777 12:06:24 -- accel/accel.sh@21 -- # val= 00:06:37.777 12:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:37.777 12:06:24 -- accel/accel.sh@21 -- # val= 00:06:37.777 12:06:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # IFS=: 00:06:37.777 12:06:24 -- accel/accel.sh@20 -- # read -r var val 00:06:37.777 12:06:24 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:37.777 12:06:24 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:37.777 12:06:24 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:37.777 00:06:37.777 real 0m2.576s 00:06:37.777 user 0m2.340s 00:06:37.777 sys 0m0.244s 00:06:37.777 12:06:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.777 12:06:24 -- common/autotest_common.sh@10 -- # set +x 00:06:37.777 ************************************ 00:06:37.777 END TEST accel_crc32c_C2 00:06:37.777 ************************************ 00:06:37.777 12:06:24 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:37.777 12:06:24 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:37.777 12:06:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:37.777 12:06:24 -- common/autotest_common.sh@10 -- # set +x 00:06:37.777 ************************************ 00:06:37.777 START TEST accel_copy 
00:06:37.777 ************************************ 00:06:37.777 12:06:24 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:06:37.777 12:06:24 -- accel/accel.sh@16 -- # local accel_opc 00:06:37.777 12:06:24 -- accel/accel.sh@17 -- # local accel_module 00:06:37.777 12:06:24 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:37.777 12:06:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:37.777 12:06:24 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.777 12:06:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.777 12:06:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.777 12:06:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.777 12:06:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.777 12:06:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.777 12:06:24 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.777 12:06:24 -- accel/accel.sh@42 -- # jq -r . 00:06:37.777 [2024-11-02 12:06:24.430443] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:37.777 [2024-11-02 12:06:24.430539] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1134066 ] 00:06:37.777 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.777 [2024-11-02 12:06:24.497566] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.777 [2024-11-02 12:06:24.533063] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.152 12:06:25 -- accel/accel.sh@18 -- # out=' 00:06:39.152 SPDK Configuration: 00:06:39.152 Core mask: 0x1 00:06:39.152 00:06:39.152 Accel Perf Configuration: 00:06:39.152 Workload Type: copy 00:06:39.152 Transfer size: 4096 bytes 00:06:39.152 Vector count 1 00:06:39.152 Module: software 00:06:39.152 Queue depth: 32 00:06:39.152 Allocate depth: 32 00:06:39.152 # threads/core: 1 00:06:39.152 Run time: 1 seconds 00:06:39.152 Verify: Yes 00:06:39.152 00:06:39.152 Running for 1 seconds... 00:06:39.152 00:06:39.152 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:39.152 ------------------------------------------------------------------------------------ 00:06:39.152 0,0 546912/s 2136 MiB/s 0 0 00:06:39.152 ==================================================================================== 00:06:39.152 Total 546912/s 2136 MiB/s 0 0' 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:39.152 12:06:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:39.152 12:06:25 -- accel/accel.sh@12 -- # build_accel_config 00:06:39.152 12:06:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.152 12:06:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.152 12:06:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.152 12:06:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.152 12:06:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.152 12:06:25 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.152 12:06:25 -- accel/accel.sh@42 -- # jq -r . 00:06:39.152 [2024-11-02 12:06:25.713831] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
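The MiB/s column in these result tables is derived arithmetic: operations per second times the transfer size, scaled to MiB. A minimal sanity check for the copy table above (a hypothetical one-liner, not part of accel.sh):

  # ops/s * bytes/op, scaled to MiB/s; prints 2136, matching the copy table
  echo $(( 546912 * 4096 / 1048576 ))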
00:06:39.152 [2024-11-02 12:06:25.713926] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1134333 ] 00:06:39.152 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.152 [2024-11-02 12:06:25.780617] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.152 [2024-11-02 12:06:25.814904] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val= 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val= 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val=0x1 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val= 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val= 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val=copy 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val= 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val=software 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@23 -- # accel_module=software 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val=32 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val=32 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val=1 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val=Yes 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val= 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:39.152 12:06:25 -- accel/accel.sh@21 -- # val= 00:06:39.152 12:06:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # IFS=: 00:06:39.152 12:06:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.086 12:06:26 -- accel/accel.sh@21 -- # val= 00:06:40.086 12:06:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # IFS=: 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # read -r var val 00:06:40.086 12:06:26 -- accel/accel.sh@21 -- # val= 00:06:40.086 12:06:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # IFS=: 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # read -r var val 00:06:40.086 12:06:26 -- accel/accel.sh@21 -- # val= 00:06:40.086 12:06:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # IFS=: 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # read -r var val 00:06:40.086 12:06:26 -- accel/accel.sh@21 -- # val= 00:06:40.086 12:06:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # IFS=: 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # read -r var val 00:06:40.086 12:06:26 -- accel/accel.sh@21 -- # val= 00:06:40.086 12:06:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # IFS=: 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # read -r var val 00:06:40.086 12:06:26 -- accel/accel.sh@21 -- # val= 00:06:40.086 12:06:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # IFS=: 00:06:40.086 12:06:26 -- accel/accel.sh@20 -- # read -r var val 00:06:40.086 12:06:26 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:40.086 12:06:26 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:40.086 12:06:26 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:40.086 00:06:40.086 real 0m2.572s 00:06:40.086 user 0m2.330s 00:06:40.086 sys 0m0.251s 00:06:40.086 12:06:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:40.086 12:06:26 -- common/autotest_common.sh@10 -- # set +x 00:06:40.086 ************************************ 00:06:40.086 END TEST accel_copy 00:06:40.086 ************************************ 00:06:40.086 12:06:27 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:40.086 12:06:27 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:40.086 12:06:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:40.086 12:06:27 -- common/autotest_common.sh@10 -- # set +x 00:06:40.086 ************************************ 00:06:40.086 START TEST accel_fill 00:06:40.086 ************************************ 00:06:40.086 12:06:27 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:40.086 12:06:27 -- accel/accel.sh@16 -- # local accel_opc 
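The fill pass adds pattern and depth flags to the accel_perf invocation repeated in the trace below; their meanings can be read off the configuration block the tool prints next. A hedged decoding, inferred from that output rather than from accel_perf's own usage text:

  # inferred flag mapping for this test's accel_perf call:
  #   -t 1    run time (seconds)      -w fill  workload type
  #   -f 128  fill pattern (0x80)     -q 64    queue depth
  #   -a 64   allocate depth          -y       verify the result
  accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y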
00:06:40.086 12:06:27 -- accel/accel.sh@17 -- # local accel_module 00:06:40.086 12:06:27 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:40.086 12:06:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:40.086 12:06:27 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.086 12:06:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.086 12:06:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.086 12:06:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.086 12:06:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.086 12:06:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.086 12:06:27 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.086 12:06:27 -- accel/accel.sh@42 -- # jq -r . 00:06:40.086 [2024-11-02 12:06:27.049580] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:40.086 [2024-11-02 12:06:27.049676] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1134614 ] 00:06:40.345 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.345 [2024-11-02 12:06:27.116781] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.345 [2024-11-02 12:06:27.152418] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.720 12:06:28 -- accel/accel.sh@18 -- # out=' 00:06:41.720 SPDK Configuration: 00:06:41.720 Core mask: 0x1 00:06:41.720 00:06:41.720 Accel Perf Configuration: 00:06:41.720 Workload Type: fill 00:06:41.720 Fill pattern: 0x80 00:06:41.720 Transfer size: 4096 bytes 00:06:41.720 Vector count 1 00:06:41.720 Module: software 00:06:41.720 Queue depth: 64 00:06:41.720 Allocate depth: 64 00:06:41.720 # threads/core: 1 00:06:41.720 Run time: 1 seconds 00:06:41.720 Verify: Yes 00:06:41.720 00:06:41.720 Running for 1 seconds... 00:06:41.720 00:06:41.720 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:41.720 ------------------------------------------------------------------------------------ 00:06:41.720 0,0 963776/s 3764 MiB/s 0 0 00:06:41.720 ==================================================================================== 00:06:41.720 Total 963776/s 3764 MiB/s 0 0' 00:06:41.720 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.720 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.720 12:06:28 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:41.720 12:06:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:41.720 12:06:28 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.720 12:06:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.720 12:06:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.720 12:06:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.720 12:06:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.720 12:06:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.720 12:06:28 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.720 12:06:28 -- accel/accel.sh@42 -- # jq -r . 00:06:41.720 [2024-11-02 12:06:28.333965] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
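The val= / case / IFS=: / read lines that follow, and that thread through this whole section, come from one parsing idiom in accel.sh: the perf output is captured into out= (visible in the trace as out=') and each 'Key: value' line is split on ':' to recover the opcode and module asserted on at END TEST. A simplified reconstruction of that loop, not a verbatim excerpt:

  # reconstructed sketch of the accel.sh parsing loop (simplified):
  while IFS=: read -r var val; do
    case "$var" in
      *'Workload Type'*) accel_opc=$val ;;      # e.g. fill
      *'Module'*)        accel_module=$val ;;   # e.g. software
    esac
  done <<< "$out"
  [[ -n $accel_module ]] && [[ -n $accel_opc ]] # the checks seen at each END TEST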
00:06:41.720 [2024-11-02 12:06:28.334089] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1134883 ] 00:06:41.720 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.720 [2024-11-02 12:06:28.401192] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.720 [2024-11-02 12:06:28.435371] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.720 12:06:28 -- accel/accel.sh@21 -- # val= 00:06:41.720 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.720 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.720 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.720 12:06:28 -- accel/accel.sh@21 -- # val= 00:06:41.720 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.720 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.720 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.720 12:06:28 -- accel/accel.sh@21 -- # val=0x1 00:06:41.720 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.720 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.720 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.720 12:06:28 -- accel/accel.sh@21 -- # val= 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val= 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val=fill 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val=0x80 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val= 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val=software 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@23 -- # accel_module=software 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val=64 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val=64 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- 
accel/accel.sh@21 -- # val=1 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val=Yes 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val= 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:41.721 12:06:28 -- accel/accel.sh@21 -- # val= 00:06:41.721 12:06:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # IFS=: 00:06:41.721 12:06:28 -- accel/accel.sh@20 -- # read -r var val 00:06:42.655 12:06:29 -- accel/accel.sh@21 -- # val= 00:06:42.655 12:06:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.655 12:06:29 -- accel/accel.sh@20 -- # IFS=: 00:06:42.655 12:06:29 -- accel/accel.sh@20 -- # read -r var val 00:06:42.655 12:06:29 -- accel/accel.sh@21 -- # val= 00:06:42.655 12:06:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.655 12:06:29 -- accel/accel.sh@20 -- # IFS=: 00:06:42.655 12:06:29 -- accel/accel.sh@20 -- # read -r var val 00:06:42.655 12:06:29 -- accel/accel.sh@21 -- # val= 00:06:42.655 12:06:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.655 12:06:29 -- accel/accel.sh@20 -- # IFS=: 00:06:42.655 12:06:29 -- accel/accel.sh@20 -- # read -r var val 00:06:42.655 12:06:29 -- accel/accel.sh@21 -- # val= 00:06:42.655 12:06:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.655 12:06:29 -- accel/accel.sh@20 -- # IFS=: 00:06:42.655 12:06:29 -- accel/accel.sh@20 -- # read -r var val 00:06:42.655 12:06:29 -- accel/accel.sh@21 -- # val= 00:06:42.655 12:06:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.655 12:06:29 -- accel/accel.sh@20 -- # IFS=: 00:06:42.655 12:06:29 -- accel/accel.sh@20 -- # read -r var val 00:06:42.655 12:06:29 -- accel/accel.sh@21 -- # val= 00:06:42.656 12:06:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.656 12:06:29 -- accel/accel.sh@20 -- # IFS=: 00:06:42.656 12:06:29 -- accel/accel.sh@20 -- # read -r var val 00:06:42.656 12:06:29 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:42.656 12:06:29 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:42.656 12:06:29 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:42.656 00:06:42.656 real 0m2.574s 00:06:42.656 user 0m2.323s 00:06:42.656 sys 0m0.259s 00:06:42.656 12:06:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:42.656 12:06:29 -- common/autotest_common.sh@10 -- # set +x 00:06:42.656 ************************************ 00:06:42.656 END TEST accel_fill 00:06:42.656 ************************************ 00:06:42.924 12:06:29 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:42.924 12:06:29 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:42.924 12:06:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:42.924 12:06:29 -- common/autotest_common.sh@10 -- # set +x 00:06:42.924 ************************************ 00:06:42.924 START TEST 
accel_copy_crc32c 00:06:42.924 ************************************ 00:06:42.924 12:06:29 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:06:42.924 12:06:29 -- accel/accel.sh@16 -- # local accel_opc 00:06:42.924 12:06:29 -- accel/accel.sh@17 -- # local accel_module 00:06:42.924 12:06:29 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:42.924 12:06:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:42.924 12:06:29 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.924 12:06:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.924 12:06:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.924 12:06:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.924 12:06:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.924 12:06:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.924 12:06:29 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.924 12:06:29 -- accel/accel.sh@42 -- # jq -r . 00:06:42.924 [2024-11-02 12:06:29.672008] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:42.925 [2024-11-02 12:06:29.672107] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1135092 ] 00:06:42.925 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.925 [2024-11-02 12:06:29.740009] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.925 [2024-11-02 12:06:29.775845] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.300 12:06:30 -- accel/accel.sh@18 -- # out=' 00:06:44.300 SPDK Configuration: 00:06:44.300 Core mask: 0x1 00:06:44.300 00:06:44.300 Accel Perf Configuration: 00:06:44.300 Workload Type: copy_crc32c 00:06:44.300 CRC-32C seed: 0 00:06:44.300 Vector size: 4096 bytes 00:06:44.300 Transfer size: 4096 bytes 00:06:44.300 Vector count 1 00:06:44.300 Module: software 00:06:44.300 Queue depth: 32 00:06:44.300 Allocate depth: 32 00:06:44.300 # threads/core: 1 00:06:44.300 Run time: 1 seconds 00:06:44.300 Verify: Yes 00:06:44.300 00:06:44.300 Running for 1 seconds... 00:06:44.300 00:06:44.300 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:44.300 ------------------------------------------------------------------------------------ 00:06:44.300 0,0 401344/s 1567 MiB/s 0 0 00:06:44.300 ==================================================================================== 00:06:44.300 Total 401344/s 1567 MiB/s 0 0' 00:06:44.300 12:06:30 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:30 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:30 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:44.300 12:06:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:44.300 12:06:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:44.300 12:06:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:44.300 12:06:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.300 12:06:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.300 12:06:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:44.300 12:06:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:44.300 12:06:30 -- accel/accel.sh@41 -- # local IFS=, 00:06:44.300 12:06:30 -- accel/accel.sh@42 -- # jq -r . 
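Worked out against the earlier plain-copy run, the copy_crc32c table above puts a price on the added CRC step: 401344/s x 4096 B is roughly 1567 MiB/s, about a quarter below the 2136 MiB/s that copy alone managed on the same software path.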
00:06:44.300 [2024-11-02 12:06:30.957303] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:44.300 [2024-11-02 12:06:30.957402] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1135232 ] 00:06:44.300 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.300 [2024-11-02 12:06:31.023638] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.300 [2024-11-02 12:06:31.058228] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val= 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val= 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val=0x1 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val= 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val= 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val=0 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val= 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val=software 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@23 -- # accel_module=software 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val=32 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 
00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val=32 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val=1 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val=Yes 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val= 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:44.300 12:06:31 -- accel/accel.sh@21 -- # val= 00:06:44.300 12:06:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # IFS=: 00:06:44.300 12:06:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.676 12:06:32 -- accel/accel.sh@21 -- # val= 00:06:45.676 12:06:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # IFS=: 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # read -r var val 00:06:45.676 12:06:32 -- accel/accel.sh@21 -- # val= 00:06:45.676 12:06:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # IFS=: 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # read -r var val 00:06:45.676 12:06:32 -- accel/accel.sh@21 -- # val= 00:06:45.676 12:06:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # IFS=: 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # read -r var val 00:06:45.676 12:06:32 -- accel/accel.sh@21 -- # val= 00:06:45.676 12:06:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # IFS=: 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # read -r var val 00:06:45.676 12:06:32 -- accel/accel.sh@21 -- # val= 00:06:45.676 12:06:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # IFS=: 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # read -r var val 00:06:45.676 12:06:32 -- accel/accel.sh@21 -- # val= 00:06:45.676 12:06:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # IFS=: 00:06:45.676 12:06:32 -- accel/accel.sh@20 -- # read -r var val 00:06:45.676 12:06:32 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:45.676 12:06:32 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:45.676 12:06:32 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:45.676 00:06:45.676 real 0m2.573s 00:06:45.676 user 0m2.333s 00:06:45.676 sys 0m0.249s 00:06:45.676 12:06:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:45.676 12:06:32 -- common/autotest_common.sh@10 -- # set +x 00:06:45.676 ************************************ 00:06:45.676 END TEST accel_copy_crc32c 00:06:45.676 ************************************ 00:06:45.676 
12:06:32 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:45.676 12:06:32 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:45.676 12:06:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:45.676 12:06:32 -- common/autotest_common.sh@10 -- # set +x 00:06:45.676 ************************************ 00:06:45.676 START TEST accel_copy_crc32c_C2 00:06:45.676 ************************************ 00:06:45.676 12:06:32 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:45.676 12:06:32 -- accel/accel.sh@16 -- # local accel_opc 00:06:45.676 12:06:32 -- accel/accel.sh@17 -- # local accel_module 00:06:45.676 12:06:32 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:45.676 12:06:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:45.676 12:06:32 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.676 12:06:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.676 12:06:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.676 12:06:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.676 12:06:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.676 12:06:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.676 12:06:32 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.676 12:06:32 -- accel/accel.sh@42 -- # jq -r . 00:06:45.676 [2024-11-02 12:06:32.296185] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:45.676 [2024-11-02 12:06:32.296280] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1135471 ] 00:06:45.676 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.676 [2024-11-02 12:06:32.367179] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.676 [2024-11-02 12:06:32.402648] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.611 12:06:33 -- accel/accel.sh@18 -- # out=' 00:06:46.611 SPDK Configuration: 00:06:46.611 Core mask: 0x1 00:06:46.611 00:06:46.611 Accel Perf Configuration: 00:06:46.611 Workload Type: copy_crc32c 00:06:46.611 CRC-32C seed: 0 00:06:46.611 Vector size: 4096 bytes 00:06:46.611 Transfer size: 8192 bytes 00:06:46.611 Vector count 2 00:06:46.611 Module: software 00:06:46.611 Queue depth: 32 00:06:46.611 Allocate depth: 32 00:06:46.611 # threads/core: 1 00:06:46.611 Run time: 1 seconds 00:06:46.611 Verify: Yes 00:06:46.611 00:06:46.611 Running for 1 seconds... 
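A note on the figures that follow: with -C 2 each operation consumes two 4096-byte source vectors, which is why the configuration above reports an 8192-byte transfer size. The per-core row prices the full 8192-byte transfer (296224/s x 8192 B, roughly 2314 MiB/s), while the Total row works out to the per-vector figure (296224/s x 4096 B, roughly 1157 MiB/s); the two differ by exactly the vector count, not by any lost work.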
00:06:46.611 00:06:46.611 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:46.611 ------------------------------------------------------------------------------------ 00:06:46.611 0,0 296224/s 2314 MiB/s 0 0 00:06:46.611 ==================================================================================== 00:06:46.611 Total 296224/s 1157 MiB/s 0 0' 00:06:46.611 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.611 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.611 12:06:33 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:46.611 12:06:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:46.611 12:06:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.611 12:06:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.611 12:06:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.611 12:06:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.611 12:06:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.611 12:06:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.611 12:06:33 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.611 12:06:33 -- accel/accel.sh@42 -- # jq -r . 00:06:46.611 [2024-11-02 12:06:33.583981] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:46.611 [2024-11-02 12:06:33.584084] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1135749 ] 00:06:46.870 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.870 [2024-11-02 12:06:33.650474] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.870 [2024-11-02 12:06:33.684782] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val= 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val= 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val=0x1 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val= 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val= 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val=0 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # 
IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val= 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val=software 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@23 -- # accel_module=software 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val=32 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val=32 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val=1 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val=Yes 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val= 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:46.870 12:06:33 -- accel/accel.sh@21 -- # val= 00:06:46.870 12:06:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # IFS=: 00:06:46.870 12:06:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.246 12:06:34 -- accel/accel.sh@21 -- # val= 00:06:48.246 12:06:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # IFS=: 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # read -r var val 00:06:48.246 12:06:34 -- accel/accel.sh@21 -- # val= 00:06:48.246 12:06:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # IFS=: 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # read -r var val 00:06:48.246 12:06:34 -- accel/accel.sh@21 -- # val= 00:06:48.246 12:06:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # IFS=: 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # read -r var val 00:06:48.246 12:06:34 -- accel/accel.sh@21 -- # val= 00:06:48.246 12:06:34 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # IFS=: 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # read -r var val 00:06:48.246 12:06:34 -- accel/accel.sh@21 -- # val= 00:06:48.246 12:06:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # IFS=: 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # read -r var val 00:06:48.246 12:06:34 -- accel/accel.sh@21 -- # val= 00:06:48.246 12:06:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # IFS=: 00:06:48.246 12:06:34 -- accel/accel.sh@20 -- # read -r var val 00:06:48.246 12:06:34 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:48.246 12:06:34 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:48.246 12:06:34 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:48.246 00:06:48.246 real 0m2.579s 00:06:48.246 user 0m2.333s 00:06:48.246 sys 0m0.256s 00:06:48.246 12:06:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:48.246 12:06:34 -- common/autotest_common.sh@10 -- # set +x 00:06:48.246 ************************************ 00:06:48.246 END TEST accel_copy_crc32c_C2 00:06:48.246 ************************************ 00:06:48.246 12:06:34 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:48.246 12:06:34 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:48.246 12:06:34 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:48.246 12:06:34 -- common/autotest_common.sh@10 -- # set +x 00:06:48.246 ************************************ 00:06:48.246 START TEST accel_dualcast 00:06:48.246 ************************************ 00:06:48.246 12:06:34 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:06:48.246 12:06:34 -- accel/accel.sh@16 -- # local accel_opc 00:06:48.246 12:06:34 -- accel/accel.sh@17 -- # local accel_module 00:06:48.246 12:06:34 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:48.246 12:06:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:48.246 12:06:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.246 12:06:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:48.246 12:06:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.246 12:06:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.246 12:06:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:48.246 12:06:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:48.246 12:06:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:48.246 12:06:34 -- accel/accel.sh@42 -- # jq -r . 00:06:48.246 [2024-11-02 12:06:34.920556] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:06:48.246 [2024-11-02 12:06:34.920651] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1136031 ] 00:06:48.246 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.246 [2024-11-02 12:06:34.987719] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.246 [2024-11-02 12:06:35.022985] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.622 12:06:36 -- accel/accel.sh@18 -- # out=' 00:06:49.622 SPDK Configuration: 00:06:49.622 Core mask: 0x1 00:06:49.622 00:06:49.622 Accel Perf Configuration: 00:06:49.622 Workload Type: dualcast 00:06:49.622 Transfer size: 4096 bytes 00:06:49.622 Vector count 1 00:06:49.622 Module: software 00:06:49.622 Queue depth: 32 00:06:49.622 Allocate depth: 32 00:06:49.622 # threads/core: 1 00:06:49.622 Run time: 1 seconds 00:06:49.622 Verify: Yes 00:06:49.622 00:06:49.622 Running for 1 seconds... 00:06:49.622 00:06:49.622 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:49.622 ------------------------------------------------------------------------------------ 00:06:49.622 0,0 631456/s 2466 MiB/s 0 0 00:06:49.622 ==================================================================================== 00:06:49.622 Total 631456/s 2466 MiB/s 0 0' 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.622 12:06:36 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.622 12:06:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:49.622 12:06:36 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.622 12:06:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.622 12:06:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.622 12:06:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.622 12:06:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.622 12:06:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.622 12:06:36 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.622 12:06:36 -- accel/accel.sh@42 -- # jq -r . 00:06:49.622 [2024-11-02 12:06:36.204101] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
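For the dualcast numbers above: each operation broadcasts one 4096-byte source into two destination buffers, and the table evidently counts the source side, since 631456/s x 4096 B gives the reported 2466 MiB/s; the bytes actually written per second are double that.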
00:06:49.622 [2024-11-02 12:06:36.204198] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1136297 ] 00:06:49.622 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.622 [2024-11-02 12:06:36.270580] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.622 [2024-11-02 12:06:36.304872] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.622 12:06:36 -- accel/accel.sh@21 -- # val= 00:06:49.622 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.622 12:06:36 -- accel/accel.sh@21 -- # val= 00:06:49.622 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.622 12:06:36 -- accel/accel.sh@21 -- # val=0x1 00:06:49.622 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.622 12:06:36 -- accel/accel.sh@21 -- # val= 00:06:49.622 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.622 12:06:36 -- accel/accel.sh@21 -- # val= 00:06:49.622 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.622 12:06:36 -- accel/accel.sh@21 -- # val=dualcast 00:06:49.622 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.622 12:06:36 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.622 12:06:36 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:49.622 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.622 12:06:36 -- accel/accel.sh@21 -- # val= 00:06:49.622 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.622 12:06:36 -- accel/accel.sh@21 -- # val=software 00:06:49.622 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.622 12:06:36 -- accel/accel.sh@23 -- # accel_module=software 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.622 12:06:36 -- accel/accel.sh@21 -- # val=32 00:06:49.622 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.622 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.622 12:06:36 -- accel/accel.sh@21 -- # val=32 00:06:49.623 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.623 12:06:36 -- accel/accel.sh@21 -- # val=1 00:06:49.623 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.623 12:06:36 
-- accel/accel.sh@21 -- # val='1 seconds' 00:06:49.623 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.623 12:06:36 -- accel/accel.sh@21 -- # val=Yes 00:06:49.623 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.623 12:06:36 -- accel/accel.sh@21 -- # val= 00:06:49.623 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:49.623 12:06:36 -- accel/accel.sh@21 -- # val= 00:06:49.623 12:06:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # IFS=: 00:06:49.623 12:06:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.558 12:06:37 -- accel/accel.sh@21 -- # val= 00:06:50.558 12:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # IFS=: 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # read -r var val 00:06:50.558 12:06:37 -- accel/accel.sh@21 -- # val= 00:06:50.558 12:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # IFS=: 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # read -r var val 00:06:50.558 12:06:37 -- accel/accel.sh@21 -- # val= 00:06:50.558 12:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # IFS=: 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # read -r var val 00:06:50.558 12:06:37 -- accel/accel.sh@21 -- # val= 00:06:50.558 12:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # IFS=: 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # read -r var val 00:06:50.558 12:06:37 -- accel/accel.sh@21 -- # val= 00:06:50.558 12:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # IFS=: 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # read -r var val 00:06:50.558 12:06:37 -- accel/accel.sh@21 -- # val= 00:06:50.558 12:06:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # IFS=: 00:06:50.558 12:06:37 -- accel/accel.sh@20 -- # read -r var val 00:06:50.558 12:06:37 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:50.558 12:06:37 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:50.558 12:06:37 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:50.558 00:06:50.558 real 0m2.571s 00:06:50.558 user 0m2.332s 00:06:50.558 sys 0m0.247s 00:06:50.558 12:06:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.558 12:06:37 -- common/autotest_common.sh@10 -- # set +x 00:06:50.558 ************************************ 00:06:50.558 END TEST accel_dualcast 00:06:50.558 ************************************ 00:06:50.558 12:06:37 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:50.558 12:06:37 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:50.558 12:06:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:50.558 12:06:37 -- common/autotest_common.sh@10 -- # set +x 00:06:50.558 ************************************ 00:06:50.558 START TEST accel_compare 00:06:50.558 ************************************ 00:06:50.558 12:06:37 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:06:50.558 12:06:37 -- accel/accel.sh@16 -- # local accel_opc 00:06:50.558 12:06:37 
-- accel/accel.sh@17 -- # local accel_module 00:06:50.558 12:06:37 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:50.558 12:06:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:50.558 12:06:37 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.558 12:06:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:50.558 12:06:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.558 12:06:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.558 12:06:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:50.558 12:06:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:50.558 12:06:37 -- accel/accel.sh@41 -- # local IFS=, 00:06:50.558 12:06:37 -- accel/accel.sh@42 -- # jq -r . 00:06:50.817 [2024-11-02 12:06:37.540575] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:50.817 [2024-11-02 12:06:37.540669] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1136548 ] 00:06:50.817 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.817 [2024-11-02 12:06:37.608610] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.817 [2024-11-02 12:06:37.644088] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.193 12:06:38 -- accel/accel.sh@18 -- # out=' 00:06:52.193 SPDK Configuration: 00:06:52.193 Core mask: 0x1 00:06:52.193 00:06:52.193 Accel Perf Configuration: 00:06:52.193 Workload Type: compare 00:06:52.193 Transfer size: 4096 bytes 00:06:52.193 Vector count 1 00:06:52.193 Module: software 00:06:52.193 Queue depth: 32 00:06:52.193 Allocate depth: 32 00:06:52.193 # threads/core: 1 00:06:52.193 Run time: 1 seconds 00:06:52.193 Verify: Yes 00:06:52.193 00:06:52.193 Running for 1 seconds... 00:06:52.193 00:06:52.193 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:52.193 ------------------------------------------------------------------------------------ 00:06:52.193 0,0 835360/s 3263 MiB/s 0 0 00:06:52.193 ==================================================================================== 00:06:52.193 Total 835360/s 3263 MiB/s 0 0' 00:06:52.193 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.193 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.193 12:06:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:52.193 12:06:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:52.193 12:06:38 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.193 12:06:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.193 12:06:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.193 12:06:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.193 12:06:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.193 12:06:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.193 12:06:38 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.193 12:06:38 -- accel/accel.sh@42 -- # jq -r . 00:06:52.193 [2024-11-02 12:06:38.824821] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
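The compare table above rounds out the software-path survey: 835360/s x 4096 B is roughly 3263 MiB/s, second only to fill's 3764 MiB/s in this run, which fits the pattern that the write-only (fill) and read-only (compare) opcodes outpace the read-plus-write copy variants.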
00:06:52.193 [2024-11-02 12:06:38.824917] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1136704 ] 00:06:52.193 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.193 [2024-11-02 12:06:38.891858] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.193 [2024-11-02 12:06:38.926329] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.193 12:06:38 -- accel/accel.sh@21 -- # val= 00:06:52.193 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.193 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.193 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.193 12:06:38 -- accel/accel.sh@21 -- # val= 00:06:52.193 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.193 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.193 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.193 12:06:38 -- accel/accel.sh@21 -- # val=0x1 00:06:52.193 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.193 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.193 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.193 12:06:38 -- accel/accel.sh@21 -- # val= 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- accel/accel.sh@21 -- # val= 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- accel/accel.sh@21 -- # val=compare 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- accel/accel.sh@21 -- # val= 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- accel/accel.sh@21 -- # val=software 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@23 -- # accel_module=software 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- accel/accel.sh@21 -- # val=32 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- accel/accel.sh@21 -- # val=32 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- accel/accel.sh@21 -- # val=1 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- accel/accel.sh@21 -- # val=Yes 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- accel/accel.sh@21 -- # val= 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:52.194 12:06:38 -- accel/accel.sh@21 -- # val= 00:06:52.194 12:06:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # IFS=: 00:06:52.194 12:06:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.129 12:06:40 -- accel/accel.sh@21 -- # val= 00:06:53.129 12:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # IFS=: 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # read -r var val 00:06:53.129 12:06:40 -- accel/accel.sh@21 -- # val= 00:06:53.129 12:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # IFS=: 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # read -r var val 00:06:53.129 12:06:40 -- accel/accel.sh@21 -- # val= 00:06:53.129 12:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # IFS=: 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # read -r var val 00:06:53.129 12:06:40 -- accel/accel.sh@21 -- # val= 00:06:53.129 12:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # IFS=: 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # read -r var val 00:06:53.129 12:06:40 -- accel/accel.sh@21 -- # val= 00:06:53.129 12:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # IFS=: 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # read -r var val 00:06:53.129 12:06:40 -- accel/accel.sh@21 -- # val= 00:06:53.129 12:06:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # IFS=: 00:06:53.129 12:06:40 -- accel/accel.sh@20 -- # read -r var val 00:06:53.129 12:06:40 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:53.129 12:06:40 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:53.129 12:06:40 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:53.129 00:06:53.129 real 0m2.576s 00:06:53.129 user 0m2.330s 00:06:53.129 sys 0m0.256s 00:06:53.129 12:06:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.129 12:06:40 -- common/autotest_common.sh@10 -- # set +x 00:06:53.129 ************************************ 00:06:53.129 END TEST accel_compare 00:06:53.129 ************************************ 00:06:53.388 12:06:40 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:53.388 12:06:40 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:53.388 12:06:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:53.388 12:06:40 -- common/autotest_common.sh@10 -- # set +x 00:06:53.388 ************************************ 00:06:53.388 START TEST accel_xor 00:06:53.388 ************************************ 00:06:53.388 12:06:40 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:06:53.388 12:06:40 -- accel/accel.sh@16 -- # local accel_opc 00:06:53.388 12:06:40 -- accel/accel.sh@17 
-- # local accel_module 00:06:53.388 12:06:40 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:53.388 12:06:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:53.388 12:06:40 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.388 12:06:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.388 12:06:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.388 12:06:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.388 12:06:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.388 12:06:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.388 12:06:40 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.388 12:06:40 -- accel/accel.sh@42 -- # jq -r . 00:06:53.388 [2024-11-02 12:06:40.166211] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:53.388 [2024-11-02 12:06:40.166303] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1136898 ] 00:06:53.388 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.388 [2024-11-02 12:06:40.235197] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.389 [2024-11-02 12:06:40.271240] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.764 12:06:41 -- accel/accel.sh@18 -- # out=' 00:06:54.764 SPDK Configuration: 00:06:54.764 Core mask: 0x1 00:06:54.764 00:06:54.764 Accel Perf Configuration: 00:06:54.764 Workload Type: xor 00:06:54.764 Source buffers: 2 00:06:54.764 Transfer size: 4096 bytes 00:06:54.764 Vector count 1 00:06:54.764 Module: software 00:06:54.764 Queue depth: 32 00:06:54.764 Allocate depth: 32 00:06:54.764 # threads/core: 1 00:06:54.764 Run time: 1 seconds 00:06:54.764 Verify: Yes 00:06:54.764 00:06:54.764 Running for 1 seconds... 00:06:54.764 00:06:54.764 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:54.764 ------------------------------------------------------------------------------------ 00:06:54.764 0,0 691456/s 2701 MiB/s 0 0 00:06:54.764 ==================================================================================== 00:06:54.764 Total 691456/s 2701 MiB/s 0 0' 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:54.764 12:06:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:54.764 12:06:41 -- accel/accel.sh@12 -- # build_accel_config 00:06:54.764 12:06:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:54.764 12:06:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.764 12:06:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.764 12:06:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:54.764 12:06:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:54.764 12:06:41 -- accel/accel.sh@41 -- # local IFS=, 00:06:54.764 12:06:41 -- accel/accel.sh@42 -- # jq -r . 00:06:54.764 [2024-11-02 12:06:41.453047] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:06:54.764 [2024-11-02 12:06:41.453140] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1137158 ] 00:06:54.764 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.764 [2024-11-02 12:06:41.520373] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.764 [2024-11-02 12:06:41.554852] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val= 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val= 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val=0x1 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val= 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val= 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val=xor 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val=2 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val= 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val=software 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@23 -- # accel_module=software 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val=32 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val=32 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- 
accel/accel.sh@21 -- # val=1 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val=Yes 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val= 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:54.764 12:06:41 -- accel/accel.sh@21 -- # val= 00:06:54.764 12:06:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # IFS=: 00:06:54.764 12:06:41 -- accel/accel.sh@20 -- # read -r var val 00:06:56.140 12:06:42 -- accel/accel.sh@21 -- # val= 00:06:56.140 12:06:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # IFS=: 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # read -r var val 00:06:56.140 12:06:42 -- accel/accel.sh@21 -- # val= 00:06:56.140 12:06:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # IFS=: 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # read -r var val 00:06:56.140 12:06:42 -- accel/accel.sh@21 -- # val= 00:06:56.140 12:06:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # IFS=: 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # read -r var val 00:06:56.140 12:06:42 -- accel/accel.sh@21 -- # val= 00:06:56.140 12:06:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # IFS=: 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # read -r var val 00:06:56.140 12:06:42 -- accel/accel.sh@21 -- # val= 00:06:56.140 12:06:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # IFS=: 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # read -r var val 00:06:56.140 12:06:42 -- accel/accel.sh@21 -- # val= 00:06:56.140 12:06:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # IFS=: 00:06:56.140 12:06:42 -- accel/accel.sh@20 -- # read -r var val 00:06:56.140 12:06:42 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:56.140 12:06:42 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:56.140 12:06:42 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.140 00:06:56.140 real 0m2.577s 00:06:56.140 user 0m2.336s 00:06:56.140 sys 0m0.249s 00:06:56.140 12:06:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.140 12:06:42 -- common/autotest_common.sh@10 -- # set +x 00:06:56.140 ************************************ 00:06:56.140 END TEST accel_xor 00:06:56.140 ************************************ 00:06:56.140 12:06:42 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:56.140 12:06:42 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:56.140 12:06:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:56.140 12:06:42 -- common/autotest_common.sh@10 -- # set +x 00:06:56.140 ************************************ 00:06:56.140 START TEST accel_xor 
00:06:56.140 ************************************ 00:06:56.140 12:06:42 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:06:56.140 12:06:42 -- accel/accel.sh@16 -- # local accel_opc 00:06:56.140 12:06:42 -- accel/accel.sh@17 -- # local accel_module 00:06:56.140 12:06:42 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:56.140 12:06:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:56.140 12:06:42 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.140 12:06:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.140 12:06:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.140 12:06:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.140 12:06:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.140 12:06:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.140 12:06:42 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.140 12:06:42 -- accel/accel.sh@42 -- # jq -r . 00:06:56.140 [2024-11-02 12:06:42.788825] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:56.141 [2024-11-02 12:06:42.788916] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1137452 ] 00:06:56.141 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.141 [2024-11-02 12:06:42.855594] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.141 [2024-11-02 12:06:42.890814] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.075 12:06:44 -- accel/accel.sh@18 -- # out=' 00:06:57.075 SPDK Configuration: 00:06:57.075 Core mask: 0x1 00:06:57.075 00:06:57.075 Accel Perf Configuration: 00:06:57.075 Workload Type: xor 00:06:57.075 Source buffers: 3 00:06:57.075 Transfer size: 4096 bytes 00:06:57.075 Vector count 1 00:06:57.075 Module: software 00:06:57.075 Queue depth: 32 00:06:57.075 Allocate depth: 32 00:06:57.075 # threads/core: 1 00:06:57.075 Run time: 1 seconds 00:06:57.075 Verify: Yes 00:06:57.075 00:06:57.075 Running for 1 seconds... 00:06:57.075 00:06:57.075 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:57.075 ------------------------------------------------------------------------------------ 00:06:57.075 0,0 649632/s 2537 MiB/s 0 0 00:06:57.075 ==================================================================================== 00:06:57.075 Total 649632/s 2537 MiB/s 0 0' 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:57.334 12:06:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:57.334 12:06:44 -- accel/accel.sh@12 -- # build_accel_config 00:06:57.334 12:06:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:57.334 12:06:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.334 12:06:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.334 12:06:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:57.334 12:06:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:57.334 12:06:44 -- accel/accel.sh@41 -- # local IFS=, 00:06:57.334 12:06:44 -- accel/accel.sh@42 -- # jq -r . 00:06:57.334 [2024-11-02 12:06:44.070706] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:06:57.334 [2024-11-02 12:06:44.070796] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1137718 ] 00:06:57.334 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.334 [2024-11-02 12:06:44.137419] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.334 [2024-11-02 12:06:44.171800] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val= 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val= 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val=0x1 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val= 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val= 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val=xor 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val=3 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val= 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val=software 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@23 -- # accel_module=software 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val=32 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val=32 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- 
accel/accel.sh@21 -- # val=1 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val=Yes 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val= 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:57.334 12:06:44 -- accel/accel.sh@21 -- # val= 00:06:57.334 12:06:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # IFS=: 00:06:57.334 12:06:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.717 12:06:45 -- accel/accel.sh@21 -- # val= 00:06:58.717 12:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # IFS=: 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # read -r var val 00:06:58.717 12:06:45 -- accel/accel.sh@21 -- # val= 00:06:58.717 12:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # IFS=: 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # read -r var val 00:06:58.717 12:06:45 -- accel/accel.sh@21 -- # val= 00:06:58.717 12:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # IFS=: 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # read -r var val 00:06:58.717 12:06:45 -- accel/accel.sh@21 -- # val= 00:06:58.717 12:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # IFS=: 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # read -r var val 00:06:58.717 12:06:45 -- accel/accel.sh@21 -- # val= 00:06:58.717 12:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # IFS=: 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # read -r var val 00:06:58.717 12:06:45 -- accel/accel.sh@21 -- # val= 00:06:58.717 12:06:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # IFS=: 00:06:58.717 12:06:45 -- accel/accel.sh@20 -- # read -r var val 00:06:58.717 12:06:45 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:58.717 12:06:45 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:58.717 12:06:45 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:58.717 00:06:58.717 real 0m2.569s 00:06:58.717 user 0m2.321s 00:06:58.717 sys 0m0.256s 00:06:58.717 12:06:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.717 12:06:45 -- common/autotest_common.sh@10 -- # set +x 00:06:58.717 ************************************ 00:06:58.717 END TEST accel_xor 00:06:58.717 ************************************ 00:06:58.717 12:06:45 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:58.717 12:06:45 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:06:58.717 12:06:45 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:58.717 12:06:45 -- common/autotest_common.sh@10 -- # set +x 00:06:58.717 ************************************ 00:06:58.717 START TEST 
accel_dif_verify 00:06:58.717 ************************************ 00:06:58.717 12:06:45 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:06:58.717 12:06:45 -- accel/accel.sh@16 -- # local accel_opc 00:06:58.717 12:06:45 -- accel/accel.sh@17 -- # local accel_module 00:06:58.717 12:06:45 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:58.717 12:06:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:58.717 12:06:45 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.717 12:06:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.717 12:06:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.717 12:06:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.717 12:06:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.717 12:06:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.717 12:06:45 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.717 12:06:45 -- accel/accel.sh@42 -- # jq -r . 00:06:58.717 [2024-11-02 12:06:45.409466] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:58.717 [2024-11-02 12:06:45.409558] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1138002 ] 00:06:58.717 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.717 [2024-11-02 12:06:45.479725] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.717 [2024-11-02 12:06:45.515968] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.809 12:06:46 -- accel/accel.sh@18 -- # out=' 00:06:59.809 SPDK Configuration: 00:06:59.809 Core mask: 0x1 00:06:59.809 00:06:59.809 Accel Perf Configuration: 00:06:59.809 Workload Type: dif_verify 00:06:59.809 Vector size: 4096 bytes 00:06:59.809 Transfer size: 4096 bytes 00:06:59.809 Block size: 512 bytes 00:06:59.809 Metadata size: 8 bytes 00:06:59.809 Vector count 1 00:06:59.809 Module: software 00:06:59.809 Queue depth: 32 00:06:59.809 Allocate depth: 32 00:06:59.809 # threads/core: 1 00:06:59.809 Run time: 1 seconds 00:06:59.809 Verify: No 00:06:59.809 00:06:59.809 Running for 1 seconds... 00:06:59.809 00:06:59.809 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:59.809 ------------------------------------------------------------------------------------ 00:06:59.809 0,0 244032/s 968 MiB/s 0 0 00:06:59.809 ==================================================================================== 00:06:59.809 Total 244032/s 953 MiB/s 0 0' 00:06:59.809 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:06:59.809 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:06:59.809 12:06:46 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:59.809 12:06:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:59.809 12:06:46 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.809 12:06:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.809 12:06:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.809 12:06:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.809 12:06:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.809 12:06:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.809 12:06:46 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.809 12:06:46 -- accel/accel.sh@42 -- # jq -r . 
00:06:59.809 [2024-11-02 12:06:46.696565] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:06:59.809 [2024-11-02 12:06:46.696674] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1138167 ] 00:06:59.809 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.809 [2024-11-02 12:06:46.763616] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.068 [2024-11-02 12:06:46.798227] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val= 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val= 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val=0x1 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val= 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val= 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val=dif_verify 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val= 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val=software 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@23 -- # 
accel_module=software 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val=32 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val=32 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val=1 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val=No 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val= 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:00.068 12:06:46 -- accel/accel.sh@21 -- # val= 00:07:00.068 12:06:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # IFS=: 00:07:00.068 12:06:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.004 12:06:47 -- accel/accel.sh@21 -- # val= 00:07:01.004 12:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:01.004 12:06:47 -- accel/accel.sh@21 -- # val= 00:07:01.004 12:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:01.004 12:06:47 -- accel/accel.sh@21 -- # val= 00:07:01.004 12:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:01.004 12:06:47 -- accel/accel.sh@21 -- # val= 00:07:01.004 12:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:01.004 12:06:47 -- accel/accel.sh@21 -- # val= 00:07:01.004 12:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:01.004 12:06:47 -- accel/accel.sh@21 -- # val= 00:07:01.004 12:06:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # IFS=: 00:07:01.004 12:06:47 -- accel/accel.sh@20 -- # read -r var val 00:07:01.004 12:06:47 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:01.004 12:06:47 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:01.004 12:06:47 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.004 00:07:01.004 real 0m2.577s 00:07:01.004 user 0m2.327s 00:07:01.004 sys 0m0.259s 00:07:01.004 12:06:47 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:07:01.004 12:06:47 -- common/autotest_common.sh@10 -- # set +x 00:07:01.004 ************************************ 00:07:01.004 END TEST accel_dif_verify 00:07:01.004 ************************************ 00:07:01.262 12:06:48 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:01.262 12:06:48 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:01.262 12:06:48 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:01.262 12:06:48 -- common/autotest_common.sh@10 -- # set +x 00:07:01.262 ************************************ 00:07:01.262 START TEST accel_dif_generate 00:07:01.262 ************************************ 00:07:01.262 12:06:48 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:07:01.262 12:06:48 -- accel/accel.sh@16 -- # local accel_opc 00:07:01.262 12:06:48 -- accel/accel.sh@17 -- # local accel_module 00:07:01.262 12:06:48 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:07:01.262 12:06:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:01.262 12:06:48 -- accel/accel.sh@12 -- # build_accel_config 00:07:01.262 12:06:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:01.262 12:06:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.262 12:06:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.262 12:06:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:01.263 12:06:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:01.263 12:06:48 -- accel/accel.sh@41 -- # local IFS=, 00:07:01.263 12:06:48 -- accel/accel.sh@42 -- # jq -r . 00:07:01.263 [2024-11-02 12:06:48.033887] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:01.263 [2024-11-02 12:06:48.033981] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1138350 ] 00:07:01.263 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.263 [2024-11-02 12:06:48.102917] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.263 [2024-11-02 12:06:48.138898] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.637 12:06:49 -- accel/accel.sh@18 -- # out=' 00:07:02.637 SPDK Configuration: 00:07:02.637 Core mask: 0x1 00:07:02.637 00:07:02.637 Accel Perf Configuration: 00:07:02.637 Workload Type: dif_generate 00:07:02.637 Vector size: 4096 bytes 00:07:02.637 Transfer size: 4096 bytes 00:07:02.637 Block size: 512 bytes 00:07:02.637 Metadata size: 8 bytes 00:07:02.637 Vector count 1 00:07:02.637 Module: software 00:07:02.637 Queue depth: 32 00:07:02.637 Allocate depth: 32 00:07:02.637 # threads/core: 1 00:07:02.637 Run time: 1 seconds 00:07:02.637 Verify: No 00:07:02.637 00:07:02.637 Running for 1 seconds... 
00:07:02.637 00:07:02.637 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:02.637 ------------------------------------------------------------------------------------ 00:07:02.637 0,0 280704/s 1113 MiB/s 0 0 00:07:02.637 ==================================================================================== 00:07:02.637 Total 280704/s 1096 MiB/s 0 0' 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.637 12:06:49 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:02.637 12:06:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:02.637 12:06:49 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.637 12:06:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.637 12:06:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.637 12:06:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.637 12:06:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.637 12:06:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.637 12:06:49 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.637 12:06:49 -- accel/accel.sh@42 -- # jq -r . 00:07:02.637 [2024-11-02 12:06:49.319928] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:02.637 [2024-11-02 12:06:49.320030] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1138580 ] 00:07:02.637 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.637 [2024-11-02 12:06:49.387057] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.637 [2024-11-02 12:06:49.421161] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.637 12:06:49 -- accel/accel.sh@21 -- # val= 00:07:02.637 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.637 12:06:49 -- accel/accel.sh@21 -- # val= 00:07:02.637 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.637 12:06:49 -- accel/accel.sh@21 -- # val=0x1 00:07:02.637 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.637 12:06:49 -- accel/accel.sh@21 -- # val= 00:07:02.637 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.637 12:06:49 -- accel/accel.sh@21 -- # val= 00:07:02.637 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.637 12:06:49 -- accel/accel.sh@21 -- # val=dif_generate 00:07:02.637 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.637 12:06:49 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.637 12:06:49 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:02.637 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # IFS=: 
00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.637 12:06:49 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:02.637 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.637 12:06:49 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:02.637 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.637 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.637 12:06:49 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:02.637 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.638 12:06:49 -- accel/accel.sh@21 -- # val= 00:07:02.638 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.638 12:06:49 -- accel/accel.sh@21 -- # val=software 00:07:02.638 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.638 12:06:49 -- accel/accel.sh@23 -- # accel_module=software 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.638 12:06:49 -- accel/accel.sh@21 -- # val=32 00:07:02.638 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.638 12:06:49 -- accel/accel.sh@21 -- # val=32 00:07:02.638 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.638 12:06:49 -- accel/accel.sh@21 -- # val=1 00:07:02.638 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.638 12:06:49 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:02.638 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.638 12:06:49 -- accel/accel.sh@21 -- # val=No 00:07:02.638 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.638 12:06:49 -- accel/accel.sh@21 -- # val= 00:07:02.638 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:02.638 12:06:49 -- accel/accel.sh@21 -- # val= 00:07:02.638 12:06:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # IFS=: 00:07:02.638 12:06:49 -- accel/accel.sh@20 -- # read -r var val 00:07:04.013 12:06:50 -- accel/accel.sh@21 -- # val= 00:07:04.013 12:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.013 12:06:50 -- accel/accel.sh@21 -- # val= 00:07:04.013 12:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.013 12:06:50 -- accel/accel.sh@21 -- # val= 00:07:04.013 12:06:50 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.013 12:06:50 -- accel/accel.sh@21 -- # val= 00:07:04.013 12:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.013 12:06:50 -- accel/accel.sh@21 -- # val= 00:07:04.013 12:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.013 12:06:50 -- accel/accel.sh@21 -- # val= 00:07:04.013 12:06:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.013 12:06:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.013 12:06:50 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:04.013 12:06:50 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:04.013 12:06:50 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.013 00:07:04.013 real 0m2.577s 00:07:04.013 user 0m2.325s 00:07:04.013 sys 0m0.263s 00:07:04.013 12:06:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.013 12:06:50 -- common/autotest_common.sh@10 -- # set +x 00:07:04.013 ************************************ 00:07:04.013 END TEST accel_dif_generate 00:07:04.013 ************************************ 00:07:04.013 12:06:50 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:04.014 12:06:50 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:04.014 12:06:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:04.014 12:06:50 -- common/autotest_common.sh@10 -- # set +x 00:07:04.014 ************************************ 00:07:04.014 START TEST accel_dif_generate_copy 00:07:04.014 ************************************ 00:07:04.014 12:06:50 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:07:04.014 12:06:50 -- accel/accel.sh@16 -- # local accel_opc 00:07:04.014 12:06:50 -- accel/accel.sh@17 -- # local accel_module 00:07:04.014 12:06:50 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:04.014 12:06:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:04.014 12:06:50 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.014 12:06:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.014 12:06:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.014 12:06:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.014 12:06:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.014 12:06:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.014 12:06:50 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.014 12:06:50 -- accel/accel.sh@42 -- # jq -r . 00:07:04.014 [2024-11-02 12:06:50.658607] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:07:04.014 [2024-11-02 12:06:50.658694] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1138865 ] 00:07:04.014 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.014 [2024-11-02 12:06:50.728760] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.014 [2024-11-02 12:06:50.764454] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.389 12:06:51 -- accel/accel.sh@18 -- # out=' 00:07:05.389 SPDK Configuration: 00:07:05.389 Core mask: 0x1 00:07:05.389 00:07:05.389 Accel Perf Configuration: 00:07:05.389 Workload Type: dif_generate_copy 00:07:05.389 Vector size: 4096 bytes 00:07:05.389 Transfer size: 4096 bytes 00:07:05.389 Vector count 1 00:07:05.389 Module: software 00:07:05.389 Queue depth: 32 00:07:05.389 Allocate depth: 32 00:07:05.389 # threads/core: 1 00:07:05.389 Run time: 1 seconds 00:07:05.389 Verify: No 00:07:05.389 00:07:05.389 Running for 1 seconds... 00:07:05.389 00:07:05.389 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:05.389 ------------------------------------------------------------------------------------ 00:07:05.389 0,0 220480/s 874 MiB/s 0 0 00:07:05.389 ==================================================================================== 00:07:05.389 Total 220480/s 861 MiB/s 0 0' 00:07:05.389 12:06:51 -- accel/accel.sh@20 -- # IFS=: 00:07:05.389 12:06:51 -- accel/accel.sh@20 -- # read -r var val 00:07:05.389 12:06:51 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:05.389 12:06:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:05.389 12:06:51 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.389 12:06:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:05.389 12:06:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.389 12:06:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.389 12:06:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:05.389 12:06:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:05.389 12:06:51 -- accel/accel.sh@41 -- # local IFS=, 00:07:05.389 12:06:51 -- accel/accel.sh@42 -- # jq -r . 00:07:05.389 [2024-11-02 12:06:51.944736] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:07:05.389 [2024-11-02 12:06:51.944827] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1139134 ] 00:07:05.389 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.389 [2024-11-02 12:06:52.011471] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.389 [2024-11-02 12:06:52.046332] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.389 12:06:52 -- accel/accel.sh@21 -- # val= 00:07:05.389 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.389 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.389 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.389 12:06:52 -- accel/accel.sh@21 -- # val= 00:07:05.389 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.389 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val=0x1 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val= 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val= 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val= 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val=software 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@23 -- # accel_module=software 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val=32 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val=32 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r 
var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val=1 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val=No 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val= 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:05.390 12:06:52 -- accel/accel.sh@21 -- # val= 00:07:05.390 12:06:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # IFS=: 00:07:05.390 12:06:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.325 12:06:53 -- accel/accel.sh@21 -- # val= 00:07:06.325 12:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.325 12:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:06.325 12:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:06.325 12:06:53 -- accel/accel.sh@21 -- # val= 00:07:06.326 12:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.326 12:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:06.326 12:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:06.326 12:06:53 -- accel/accel.sh@21 -- # val= 00:07:06.326 12:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.326 12:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:06.326 12:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:06.326 12:06:53 -- accel/accel.sh@21 -- # val= 00:07:06.326 12:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.326 12:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:06.326 12:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:06.326 12:06:53 -- accel/accel.sh@21 -- # val= 00:07:06.326 12:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.326 12:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:06.326 12:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:06.326 12:06:53 -- accel/accel.sh@21 -- # val= 00:07:06.326 12:06:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.326 12:06:53 -- accel/accel.sh@20 -- # IFS=: 00:07:06.326 12:06:53 -- accel/accel.sh@20 -- # read -r var val 00:07:06.326 12:06:53 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:06.326 12:06:53 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:06.326 12:06:53 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:06.326 00:07:06.326 real 0m2.576s 00:07:06.326 user 0m2.312s 00:07:06.326 sys 0m0.272s 00:07:06.326 12:06:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:06.326 12:06:53 -- common/autotest_common.sh@10 -- # set +x 00:07:06.326 ************************************ 00:07:06.326 END TEST accel_dif_generate_copy 00:07:06.326 ************************************ 00:07:06.326 12:06:53 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:06.326 12:06:53 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:06.326 12:06:53 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:07:06.326 12:06:53 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:07:06.326 12:06:53 -- common/autotest_common.sh@10 -- # set +x 00:07:06.326 ************************************ 00:07:06.326 START TEST accel_comp 00:07:06.326 ************************************ 00:07:06.326 12:06:53 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:06.326 12:06:53 -- accel/accel.sh@16 -- # local accel_opc 00:07:06.326 12:06:53 -- accel/accel.sh@17 -- # local accel_module 00:07:06.326 12:06:53 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:06.326 12:06:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:06.326 12:06:53 -- accel/accel.sh@12 -- # build_accel_config 00:07:06.326 12:06:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:06.326 12:06:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.326 12:06:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.326 12:06:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:06.326 12:06:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:06.326 12:06:53 -- accel/accel.sh@41 -- # local IFS=, 00:07:06.326 12:06:53 -- accel/accel.sh@42 -- # jq -r . 00:07:06.326 [2024-11-02 12:06:53.284703] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:06.326 [2024-11-02 12:06:53.284798] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1139421 ] 00:07:06.584 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.584 [2024-11-02 12:06:53.352559] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.584 [2024-11-02 12:06:53.388130] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.961 12:06:54 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:07.961 00:07:07.961 SPDK Configuration: 00:07:07.961 Core mask: 0x1 00:07:07.961 00:07:07.961 Accel Perf Configuration: 00:07:07.961 Workload Type: compress 00:07:07.961 Transfer size: 4096 bytes 00:07:07.961 Vector count 1 00:07:07.961 Module: software 00:07:07.961 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.961 Queue depth: 32 00:07:07.961 Allocate depth: 32 00:07:07.961 # threads/core: 1 00:07:07.961 Run time: 1 seconds 00:07:07.961 Verify: No 00:07:07.961 00:07:07.961 Running for 1 seconds... 
00:07:07.961 00:07:07.961 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:07.961 ------------------------------------------------------------------------------------ 00:07:07.961 0,0 66848/s 278 MiB/s 0 0 00:07:07.961 ==================================================================================== 00:07:07.961 Total 66848/s 261 MiB/s 0 0' 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.961 12:06:54 -- accel/accel.sh@12 -- # build_accel_config 00:07:07.961 12:06:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.961 12:06:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:07.961 12:06:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.961 12:06:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.961 12:06:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:07.961 12:06:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:07.961 12:06:54 -- accel/accel.sh@41 -- # local IFS=, 00:07:07.961 12:06:54 -- accel/accel.sh@42 -- # jq -r . 00:07:07.961 [2024-11-02 12:06:54.571882] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:07.961 [2024-11-02 12:06:54.571975] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1139638 ] 00:07:07.961 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.961 [2024-11-02 12:06:54.639344] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.961 [2024-11-02 12:06:54.673878] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val= 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val= 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val= 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val=0x1 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val= 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val= 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val=compress 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 
12:06:54 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val= 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val=software 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@23 -- # accel_module=software 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val=32 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val=32 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val=1 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val=No 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val= 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:07.961 12:06:54 -- accel/accel.sh@21 -- # val= 00:07:07.961 12:06:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # IFS=: 00:07:07.961 12:06:54 -- accel/accel.sh@20 -- # read -r var val 00:07:08.897 12:06:55 -- accel/accel.sh@21 -- # val= 00:07:08.897 12:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:08.897 12:06:55 -- accel/accel.sh@21 -- # val= 00:07:08.897 12:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:08.897 12:06:55 -- accel/accel.sh@21 -- # val= 00:07:08.897 12:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # 
IFS=: 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:08.897 12:06:55 -- accel/accel.sh@21 -- # val= 00:07:08.897 12:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:08.897 12:06:55 -- accel/accel.sh@21 -- # val= 00:07:08.897 12:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:08.897 12:06:55 -- accel/accel.sh@21 -- # val= 00:07:08.897 12:06:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # IFS=: 00:07:08.897 12:06:55 -- accel/accel.sh@20 -- # read -r var val 00:07:08.897 12:06:55 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:08.897 12:06:55 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:08.897 12:06:55 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:08.897 00:07:08.897 real 0m2.581s 00:07:08.897 user 0m2.336s 00:07:08.897 sys 0m0.254s 00:07:08.897 12:06:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:08.897 12:06:55 -- common/autotest_common.sh@10 -- # set +x 00:07:08.897 ************************************ 00:07:08.897 END TEST accel_comp 00:07:08.897 ************************************ 00:07:09.157 12:06:55 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:09.157 12:06:55 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:09.157 12:06:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:09.157 12:06:55 -- common/autotest_common.sh@10 -- # set +x 00:07:09.157 ************************************ 00:07:09.157 START TEST accel_decomp 00:07:09.157 ************************************ 00:07:09.157 12:06:55 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:09.157 12:06:55 -- accel/accel.sh@16 -- # local accel_opc 00:07:09.157 12:06:55 -- accel/accel.sh@17 -- # local accel_module 00:07:09.157 12:06:55 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:09.157 12:06:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:09.157 12:06:55 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.157 12:06:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:09.157 12:06:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.157 12:06:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.157 12:06:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:09.157 12:06:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:09.157 12:06:55 -- accel/accel.sh@41 -- # local IFS=, 00:07:09.157 12:06:55 -- accel/accel.sh@42 -- # jq -r . 00:07:09.157 [2024-11-02 12:06:55.914890] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
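
For reference, the decompress test traced above boils down to a single accel_perf call. A standalone equivalent would look roughly like this, a sketch assuming the workspace layout recorded in the log; -l names the input file and -y enables the verification reported as "Verify: Yes" in the configuration block that follows:

$ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
$ ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y
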
00:07:09.157 [2024-11-02 12:06:55.914986] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1139819 ] 00:07:09.157 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.157 [2024-11-02 12:06:55.983527] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.157 [2024-11-02 12:06:56.019468] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.533 12:06:57 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:10.533 00:07:10.533 SPDK Configuration: 00:07:10.533 Core mask: 0x1 00:07:10.533 00:07:10.533 Accel Perf Configuration: 00:07:10.533 Workload Type: decompress 00:07:10.533 Transfer size: 4096 bytes 00:07:10.533 Vector count 1 00:07:10.533 Module: software 00:07:10.533 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.533 Queue depth: 32 00:07:10.533 Allocate depth: 32 00:07:10.533 # threads/core: 1 00:07:10.533 Run time: 1 seconds 00:07:10.533 Verify: Yes 00:07:10.533 00:07:10.534 Running for 1 seconds... 00:07:10.534 00:07:10.534 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:10.534 ------------------------------------------------------------------------------------ 00:07:10.534 0,0 93632/s 172 MiB/s 0 0 00:07:10.534 ==================================================================================== 00:07:10.534 Total 93632/s 365 MiB/s 0 0' 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:10.534 12:06:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:10.534 12:06:57 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.534 12:06:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:10.534 12:06:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.534 12:06:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.534 12:06:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:10.534 12:06:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:10.534 12:06:57 -- accel/accel.sh@41 -- # local IFS=, 00:07:10.534 12:06:57 -- accel/accel.sh@42 -- # jq -r . 00:07:10.534 [2024-11-02 12:06:57.202623] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
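
Each traced invocation also receives -c /dev/fd/62: build_accel_config fills accel_json_cfg and the wrapper appears to hand the resulting accel JSON to the binary over an anonymous file descriptor. Bash process substitution reproduces the same plumbing, sketched here with a hypothetical empty JSON object standing in for the config the harness actually generates:

$ ./build/examples/accel_perf -c <(echo '{}') -t 1 -w decompress -l test/accel/bib -y
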
00:07:10.534 [2024-11-02 12:06:57.202716] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1139995 ] 00:07:10.534 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.534 [2024-11-02 12:06:57.270534] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.534 [2024-11-02 12:06:57.305336] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val= 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val= 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val= 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val=0x1 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val= 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val= 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val=decompress 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val= 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val=software 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@23 -- # accel_module=software 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val=32 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 
12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val=32 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val=1 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val=Yes 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val= 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:10.534 12:06:57 -- accel/accel.sh@21 -- # val= 00:07:10.534 12:06:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # IFS=: 00:07:10.534 12:06:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.911 12:06:58 -- accel/accel.sh@21 -- # val= 00:07:11.911 12:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:11.911 12:06:58 -- accel/accel.sh@21 -- # val= 00:07:11.911 12:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:11.911 12:06:58 -- accel/accel.sh@21 -- # val= 00:07:11.911 12:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:11.911 12:06:58 -- accel/accel.sh@21 -- # val= 00:07:11.911 12:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:11.911 12:06:58 -- accel/accel.sh@21 -- # val= 00:07:11.911 12:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:11.911 12:06:58 -- accel/accel.sh@21 -- # val= 00:07:11.911 12:06:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # IFS=: 00:07:11.911 12:06:58 -- accel/accel.sh@20 -- # read -r var val 00:07:11.911 12:06:58 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:11.911 12:06:58 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:11.911 12:06:58 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:11.911 00:07:11.911 real 0m2.580s 00:07:11.911 user 0m2.326s 00:07:11.911 sys 0m0.262s 00:07:11.911 12:06:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:11.911 12:06:58 -- common/autotest_common.sh@10 -- # set +x 00:07:11.911 ************************************ 00:07:11.911 END TEST accel_decomp 00:07:11.911 ************************************ 00:07:11.911 12:06:58 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:11.911 12:06:58 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:11.911 12:06:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:11.911 12:06:58 -- common/autotest_common.sh@10 -- # set +x 00:07:11.911 ************************************ 00:07:11.911 START TEST accel_decmop_full 00:07:11.911 ************************************ 00:07:11.911 12:06:58 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:11.911 12:06:58 -- accel/accel.sh@16 -- # local accel_opc 00:07:11.911 12:06:58 -- accel/accel.sh@17 -- # local accel_module 00:07:11.911 12:06:58 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:11.911 12:06:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:11.911 12:06:58 -- accel/accel.sh@12 -- # build_accel_config 00:07:11.911 12:06:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:11.911 12:06:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.911 12:06:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.911 12:06:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:11.911 12:06:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:11.911 12:06:58 -- accel/accel.sh@41 -- # local IFS=, 00:07:11.911 12:06:58 -- accel/accel.sh@42 -- # jq -r . 00:07:11.911 [2024-11-02 12:06:58.544561] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:11.911 [2024-11-02 12:06:58.544657] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1140278 ] 00:07:11.911 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.911 [2024-11-02 12:06:58.613197] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.911 [2024-11-02 12:06:58.648379] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.859 12:06:59 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:12.859 00:07:12.859 SPDK Configuration: 00:07:12.859 Core mask: 0x1 00:07:12.859 00:07:12.859 Accel Perf Configuration: 00:07:12.859 Workload Type: decompress 00:07:12.859 Transfer size: 111250 bytes 00:07:12.859 Vector count 1 00:07:12.859 Module: software 00:07:12.859 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.859 Queue depth: 32 00:07:12.860 Allocate depth: 32 00:07:12.860 # threads/core: 1 00:07:12.860 Run time: 1 seconds 00:07:12.860 Verify: Yes 00:07:12.860 00:07:12.860 Running for 1 seconds... 
00:07:12.860 00:07:12.860 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:12.860 ------------------------------------------------------------------------------------ 00:07:12.860 0,0 5888/s 243 MiB/s 0 0 00:07:12.860 ==================================================================================== 00:07:12.860 Total 5888/s 624 MiB/s 0 0' 00:07:12.860 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:12.860 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:12.860 12:06:59 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:12.860 12:06:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:12.860 12:06:59 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.860 12:06:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:12.860 12:06:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.860 12:06:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.860 12:06:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:12.860 12:06:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:12.860 12:06:59 -- accel/accel.sh@41 -- # local IFS=, 00:07:12.860 12:06:59 -- accel/accel.sh@42 -- # jq -r . 00:07:13.122 [2024-11-02 12:06:59.839583] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:13.122 [2024-11-02 12:06:59.839675] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1140548 ] 00:07:13.122 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.122 [2024-11-02 12:06:59.906585] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.122 [2024-11-02 12:06:59.940677] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.122 12:06:59 -- accel/accel.sh@21 -- # val= 00:07:13.122 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.122 12:06:59 -- accel/accel.sh@21 -- # val= 00:07:13.122 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.122 12:06:59 -- accel/accel.sh@21 -- # val= 00:07:13.122 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.122 12:06:59 -- accel/accel.sh@21 -- # val=0x1 00:07:13.122 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.122 12:06:59 -- accel/accel.sh@21 -- # val= 00:07:13.122 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.122 12:06:59 -- accel/accel.sh@21 -- # val= 00:07:13.122 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.122 12:06:59 -- accel/accel.sh@21 -- # val=decompress 00:07:13.122 12:06:59 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:13.122 12:06:59 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:13.122 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.123 12:06:59 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:13.123 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.123 12:06:59 -- accel/accel.sh@21 -- # val= 00:07:13.123 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.123 12:06:59 -- accel/accel.sh@21 -- # val=software 00:07:13.123 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.123 12:06:59 -- accel/accel.sh@23 -- # accel_module=software 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.123 12:06:59 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.123 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.123 12:06:59 -- accel/accel.sh@21 -- # val=32 00:07:13.123 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.123 12:06:59 -- accel/accel.sh@21 -- # val=32 00:07:13.123 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.123 12:06:59 -- accel/accel.sh@21 -- # val=1 00:07:13.123 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.123 12:06:59 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:13.123 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.123 12:06:59 -- accel/accel.sh@21 -- # val=Yes 00:07:13.123 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.123 12:06:59 -- accel/accel.sh@21 -- # val= 00:07:13.123 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.123 12:06:59 -- accel/accel.sh@21 -- # val= 00:07:13.123 12:06:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.123 12:06:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.498 12:07:01 -- accel/accel.sh@21 -- # val= 00:07:14.498 12:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.498 12:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:14.498 12:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:14.498 12:07:01 -- accel/accel.sh@21 -- # val= 00:07:14.498 12:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.498 12:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:14.498 12:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:14.498 12:07:01 -- accel/accel.sh@21 -- # val= 00:07:14.498 12:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.498 12:07:01 
-- accel/accel.sh@20 -- # IFS=: 00:07:14.498 12:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:14.498 12:07:01 -- accel/accel.sh@21 -- # val= 00:07:14.498 12:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.498 12:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:14.498 12:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:14.498 12:07:01 -- accel/accel.sh@21 -- # val= 00:07:14.498 12:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.498 12:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:14.498 12:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:14.498 12:07:01 -- accel/accel.sh@21 -- # val= 00:07:14.498 12:07:01 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.498 12:07:01 -- accel/accel.sh@20 -- # IFS=: 00:07:14.498 12:07:01 -- accel/accel.sh@20 -- # read -r var val 00:07:14.498 12:07:01 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:14.498 12:07:01 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:14.498 12:07:01 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:14.498 00:07:14.498 real 0m2.594s 00:07:14.498 user 0m2.348s 00:07:14.498 sys 0m0.253s 00:07:14.498 12:07:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.498 12:07:01 -- common/autotest_common.sh@10 -- # set +x 00:07:14.498 ************************************ 00:07:14.498 END TEST accel_decmop_full 00:07:14.498 ************************************ 00:07:14.498 12:07:01 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:14.498 12:07:01 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:14.498 12:07:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:14.498 12:07:01 -- common/autotest_common.sh@10 -- # set +x 00:07:14.498 ************************************ 00:07:14.498 START TEST accel_decomp_mcore 00:07:14.498 ************************************ 00:07:14.498 12:07:01 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:14.498 12:07:01 -- accel/accel.sh@16 -- # local accel_opc 00:07:14.498 12:07:01 -- accel/accel.sh@17 -- # local accel_module 00:07:14.498 12:07:01 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:14.498 12:07:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:14.498 12:07:01 -- accel/accel.sh@12 -- # build_accel_config 00:07:14.498 12:07:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:14.498 12:07:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.498 12:07:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.498 12:07:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:14.498 12:07:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:14.498 12:07:01 -- accel/accel.sh@41 -- # local IFS=, 00:07:14.498 12:07:01 -- accel/accel.sh@42 -- # jq -r . 00:07:14.498 [2024-11-02 12:07:01.186200] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
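
The mcore variant is the same decompress workload fanned out across four reactors; judging from the flags traced above, the only change from the single-core runs is the -m core mask (sketch under the same assumptions as earlier):

$ ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -m 0xf

A mask of 0xf selects cores 0-3, which lines up with the four "Reactor started" notices and the four per-core rows in the results below.
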
00:07:14.498 [2024-11-02 12:07:01.186290] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1140832 ] 00:07:14.499 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.499 [2024-11-02 12:07:01.253906] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:14.499 [2024-11-02 12:07:01.291687] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:14.499 [2024-11-02 12:07:01.291783] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:14.499 [2024-11-02 12:07:01.291865] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:14.499 [2024-11-02 12:07:01.291867] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.875 12:07:02 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:15.875 00:07:15.875 SPDK Configuration: 00:07:15.875 Core mask: 0xf 00:07:15.875 00:07:15.875 Accel Perf Configuration: 00:07:15.875 Workload Type: decompress 00:07:15.875 Transfer size: 4096 bytes 00:07:15.875 Vector count 1 00:07:15.875 Module: software 00:07:15.875 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.875 Queue depth: 32 00:07:15.875 Allocate depth: 32 00:07:15.875 # threads/core: 1 00:07:15.875 Run time: 1 seconds 00:07:15.875 Verify: Yes 00:07:15.875 00:07:15.875 Running for 1 seconds... 00:07:15.875 00:07:15.875 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:15.875 ------------------------------------------------------------------------------------ 00:07:15.875 0,0 77920/s 143 MiB/s 0 0 00:07:15.875 3,0 78240/s 144 MiB/s 0 0 00:07:15.875 2,0 78048/s 143 MiB/s 0 0 00:07:15.875 1,0 78144/s 143 MiB/s 0 0 00:07:15.875 ==================================================================================== 00:07:15.875 Total 312352/s 1220 MiB/s 0 0' 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.875 12:07:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:15.875 12:07:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:15.875 12:07:02 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.875 12:07:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:15.875 12:07:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.875 12:07:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.875 12:07:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:15.875 12:07:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:15.875 12:07:02 -- accel/accel.sh@41 -- # local IFS=, 00:07:15.875 12:07:02 -- accel/accel.sh@42 -- # jq -r . 00:07:15.875 [2024-11-02 12:07:02.483368] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:07:15.875 [2024-11-02 12:07:02.483477] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1141107 ] 00:07:15.875 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.875 [2024-11-02 12:07:02.551619] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:15.875 [2024-11-02 12:07:02.588225] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:15.875 [2024-11-02 12:07:02.588322] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:15.875 [2024-11-02 12:07:02.588383] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:15.875 [2024-11-02 12:07:02.588385] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.875 12:07:02 -- accel/accel.sh@21 -- # val= 00:07:15.875 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.875 12:07:02 -- accel/accel.sh@21 -- # val= 00:07:15.875 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.875 12:07:02 -- accel/accel.sh@21 -- # val= 00:07:15.875 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.875 12:07:02 -- accel/accel.sh@21 -- # val=0xf 00:07:15.875 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.875 12:07:02 -- accel/accel.sh@21 -- # val= 00:07:15.875 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.875 12:07:02 -- accel/accel.sh@21 -- # val= 00:07:15.875 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.875 12:07:02 -- accel/accel.sh@21 -- # val=decompress 00:07:15.875 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.875 12:07:02 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.875 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.875 12:07:02 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:15.875 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.876 12:07:02 -- accel/accel.sh@21 -- # val= 00:07:15.876 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.876 12:07:02 -- accel/accel.sh@21 -- # val=software 00:07:15.876 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.876 12:07:02 -- accel/accel.sh@23 -- # accel_module=software 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.876 12:07:02 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.876 12:07:02 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.876 12:07:02 -- accel/accel.sh@21 -- # val=32 00:07:15.876 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.876 12:07:02 -- accel/accel.sh@21 -- # val=32 00:07:15.876 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.876 12:07:02 -- accel/accel.sh@21 -- # val=1 00:07:15.876 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.876 12:07:02 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:15.876 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.876 12:07:02 -- accel/accel.sh@21 -- # val=Yes 00:07:15.876 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.876 12:07:02 -- accel/accel.sh@21 -- # val= 00:07:15.876 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:15.876 12:07:02 -- accel/accel.sh@21 -- # val= 00:07:15.876 12:07:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # IFS=: 00:07:15.876 12:07:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.810 12:07:03 -- accel/accel.sh@21 -- # val= 00:07:16.810 12:07:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # IFS=: 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # read -r var val 00:07:16.810 12:07:03 -- accel/accel.sh@21 -- # val= 00:07:16.810 12:07:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # IFS=: 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # read -r var val 00:07:16.810 12:07:03 -- accel/accel.sh@21 -- # val= 00:07:16.810 12:07:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # IFS=: 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # read -r var val 00:07:16.810 12:07:03 -- accel/accel.sh@21 -- # val= 00:07:16.810 12:07:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # IFS=: 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # read -r var val 00:07:16.810 12:07:03 -- accel/accel.sh@21 -- # val= 00:07:16.810 12:07:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # IFS=: 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # read -r var val 00:07:16.810 12:07:03 -- accel/accel.sh@21 -- # val= 00:07:16.810 12:07:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # IFS=: 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # read -r var val 00:07:16.810 12:07:03 -- accel/accel.sh@21 -- # val= 00:07:16.810 12:07:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # IFS=: 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # read -r var val 00:07:16.810 12:07:03 -- accel/accel.sh@21 -- # val= 00:07:16.810 12:07:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.810 
12:07:03 -- accel/accel.sh@20 -- # IFS=: 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # read -r var val 00:07:16.810 12:07:03 -- accel/accel.sh@21 -- # val= 00:07:16.810 12:07:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # IFS=: 00:07:16.810 12:07:03 -- accel/accel.sh@20 -- # read -r var val 00:07:16.810 12:07:03 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:16.810 12:07:03 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:16.810 12:07:03 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:16.810 00:07:16.810 real 0m2.603s 00:07:16.810 user 0m8.991s 00:07:16.810 sys 0m0.276s 00:07:16.810 12:07:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:16.810 12:07:03 -- common/autotest_common.sh@10 -- # set +x 00:07:16.810 ************************************ 00:07:16.810 END TEST accel_decomp_mcore 00:07:16.810 ************************************ 00:07:17.069 12:07:03 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:17.069 12:07:03 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:17.069 12:07:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:17.069 12:07:03 -- common/autotest_common.sh@10 -- # set +x 00:07:17.069 ************************************ 00:07:17.069 START TEST accel_decomp_full_mcore 00:07:17.069 ************************************ 00:07:17.069 12:07:03 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:17.069 12:07:03 -- accel/accel.sh@16 -- # local accel_opc 00:07:17.069 12:07:03 -- accel/accel.sh@17 -- # local accel_module 00:07:17.069 12:07:03 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:17.069 12:07:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:17.069 12:07:03 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.069 12:07:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:17.069 12:07:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.069 12:07:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.069 12:07:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:17.069 12:07:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:17.069 12:07:03 -- accel/accel.sh@41 -- # local IFS=, 00:07:17.069 12:07:03 -- accel/accel.sh@42 -- # jq -r . 00:07:17.069 [2024-11-02 12:07:03.838359] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
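
The "full" variants append -o 0 to the same command line, and in the recorded configuration blocks the transfer size correspondingly jumps from 4096 to 111250 bytes, which suggests the tool then works on whole prepared buffers rather than 4 KiB slices. A standalone sketch under the same assumptions as earlier:

$ ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -o 0 -m 0xf
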
00:07:17.069 [2024-11-02 12:07:03.838452] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1141312 ]
00:07:17.069 EAL: No free 2048 kB hugepages reported on node 1
00:07:17.069 [2024-11-02 12:07:03.905573] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:07:17.069 [2024-11-02 12:07:03.943567] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:07:17.069 [2024-11-02 12:07:03.943664] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:07:17.069 [2024-11-02 12:07:03.943739] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:07:17.069 [2024-11-02 12:07:03.943741] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:18.445 12:07:05 -- accel/accel.sh@18 -- # out='Preparing input file...
00:07:18.445
00:07:18.445 SPDK Configuration:
00:07:18.445 Core mask: 0xf
00:07:18.445
00:07:18.445 Accel Perf Configuration:
00:07:18.445 Workload Type: decompress
00:07:18.445 Transfer size: 111250 bytes
00:07:18.445 Vector count 1
00:07:18.445 Module: software
00:07:18.445 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:18.445 Queue depth: 32
00:07:18.445 Allocate depth: 32
00:07:18.445 # threads/core: 1
00:07:18.445 Run time: 1 seconds
00:07:18.445 Verify: Yes
00:07:18.445
00:07:18.445 Running for 1 seconds...
00:07:18.445
00:07:18.445 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:18.445 ------------------------------------------------------------------------------------
00:07:18.445 0,0 5792/s 239 MiB/s 0 0
00:07:18.445 3,0 5824/s 240 MiB/s 0 0
00:07:18.445 2,0 5824/s 240 MiB/s 0 0
00:07:18.445 1,0 5824/s 240 MiB/s 0 0
00:07:18.445 ====================================================================================
00:07:18.445 Total 23264/s 2468 MiB/s 0 0'
00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=:
00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val
00:07:18.445 12:07:05 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:18.445 12:07:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:18.445 12:07:05 -- accel/accel.sh@12 -- # build_accel_config
00:07:18.445 12:07:05 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:18.445 12:07:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:18.445 12:07:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:18.445 12:07:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:18.445 12:07:05 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:18.445 12:07:05 -- accel/accel.sh@41 -- # local IFS=,
00:07:18.445 12:07:05 -- accel/accel.sh@42 -- # jq -r .
00:07:18.445 [2024-11-02 12:07:05.144419] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization...
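
Both figures in the Total row above fall out of the table itself: the rate is the sum of the per-core rates, and the bandwidth follows from the 111250-byte transfer size (sketch, assuming bc):

$ echo '5792 + 5824 + 5824 + 5824' | bc
23264
$ echo '23264 * 111250 / (1024 * 1024)' | bc
2468

in agreement with "Total 23264/s 2468 MiB/s".
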
00:07:18.445 [2024-11-02 12:07:05.144514] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1141463 ] 00:07:18.445 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.445 [2024-11-02 12:07:05.210758] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:18.445 [2024-11-02 12:07:05.247513] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:18.445 [2024-11-02 12:07:05.247608] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:18.445 [2024-11-02 12:07:05.247691] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:18.445 [2024-11-02 12:07:05.247693] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val= 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val= 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val= 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val=0xf 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val= 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val= 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val=decompress 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val= 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val=software 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@23 -- # accel_module=software 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val=32 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val=32 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val=1 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val=Yes 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val= 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:18.445 12:07:05 -- accel/accel.sh@21 -- # val= 00:07:18.445 12:07:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # IFS=: 00:07:18.445 12:07:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.825 12:07:06 -- accel/accel.sh@21 -- # val= 00:07:19.825 12:07:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # IFS=: 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # read -r var val 00:07:19.825 12:07:06 -- accel/accel.sh@21 -- # val= 00:07:19.825 12:07:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # IFS=: 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # read -r var val 00:07:19.825 12:07:06 -- accel/accel.sh@21 -- # val= 00:07:19.825 12:07:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # IFS=: 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # read -r var val 00:07:19.825 12:07:06 -- accel/accel.sh@21 -- # val= 00:07:19.825 12:07:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # IFS=: 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # read -r var val 00:07:19.825 12:07:06 -- accel/accel.sh@21 -- # val= 00:07:19.825 12:07:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # IFS=: 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # read -r var val 00:07:19.825 12:07:06 -- accel/accel.sh@21 -- # val= 00:07:19.825 12:07:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # IFS=: 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # read -r var val 00:07:19.825 12:07:06 -- accel/accel.sh@21 -- # val= 00:07:19.825 12:07:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # IFS=: 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # read -r var val 00:07:19.825 12:07:06 -- accel/accel.sh@21 -- # val= 00:07:19.825 12:07:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.825 
12:07:06 -- accel/accel.sh@20 -- # IFS=: 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # read -r var val 00:07:19.825 12:07:06 -- accel/accel.sh@21 -- # val= 00:07:19.825 12:07:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # IFS=: 00:07:19.825 12:07:06 -- accel/accel.sh@20 -- # read -r var val 00:07:19.825 12:07:06 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:19.825 12:07:06 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:19.825 12:07:06 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:19.825 00:07:19.825 real 0m2.620s 00:07:19.825 user 0m9.069s 00:07:19.825 sys 0m0.269s 00:07:19.825 12:07:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.825 12:07:06 -- common/autotest_common.sh@10 -- # set +x 00:07:19.826 ************************************ 00:07:19.826 END TEST accel_decomp_full_mcore 00:07:19.826 ************************************ 00:07:19.826 12:07:06 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:19.826 12:07:06 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:19.826 12:07:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:19.826 12:07:06 -- common/autotest_common.sh@10 -- # set +x 00:07:19.826 ************************************ 00:07:19.826 START TEST accel_decomp_mthread 00:07:19.826 ************************************ 00:07:19.826 12:07:06 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:19.826 12:07:06 -- accel/accel.sh@16 -- # local accel_opc 00:07:19.826 12:07:06 -- accel/accel.sh@17 -- # local accel_module 00:07:19.826 12:07:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:19.826 12:07:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:19.826 12:07:06 -- accel/accel.sh@12 -- # build_accel_config 00:07:19.826 12:07:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:19.826 12:07:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:19.826 12:07:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:19.826 12:07:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:19.826 12:07:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:19.826 12:07:06 -- accel/accel.sh@41 -- # local IFS=, 00:07:19.826 12:07:06 -- accel/accel.sh@42 -- # jq -r . 00:07:19.826 [2024-11-02 12:07:06.504818] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:19.826 [2024-11-02 12:07:06.504914] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1141707 ] 00:07:19.826 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.826 [2024-11-02 12:07:06.574262] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.826 [2024-11-02 12:07:06.610970] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.204 12:07:07 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:21.204 00:07:21.204 SPDK Configuration: 00:07:21.204 Core mask: 0x1 00:07:21.204 00:07:21.204 Accel Perf Configuration: 00:07:21.204 Workload Type: decompress 00:07:21.204 Transfer size: 4096 bytes 00:07:21.204 Vector count 1 00:07:21.204 Module: software 00:07:21.204 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:21.204 Queue depth: 32 00:07:21.204 Allocate depth: 32 00:07:21.204 # threads/core: 2 00:07:21.204 Run time: 1 seconds 00:07:21.204 Verify: Yes 00:07:21.204 00:07:21.204 Running for 1 seconds... 00:07:21.204 00:07:21.204 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:21.204 ------------------------------------------------------------------------------------ 00:07:21.204 0,1 47616/s 87 MiB/s 0 0 00:07:21.204 0,0 47456/s 87 MiB/s 0 0 00:07:21.204 ==================================================================================== 00:07:21.204 Total 95072/s 371 MiB/s 0 0' 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:21.204 12:07:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:21.204 12:07:07 -- accel/accel.sh@12 -- # build_accel_config 00:07:21.204 12:07:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:21.204 12:07:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:21.204 12:07:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:21.204 12:07:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:21.204 12:07:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:21.204 12:07:07 -- accel/accel.sh@41 -- # local IFS=, 00:07:21.204 12:07:07 -- accel/accel.sh@42 -- # jq -r . 00:07:21.204 [2024-11-02 12:07:07.797846] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
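The accel_perf command line driving this pass is visible in the xtrace above. As a minimal standalone sketch (paths as used in this workspace; the harness additionally feeds a JSON accel config via -c /dev/fd/62, and dropping it here assumes the defaults suffice), the same two-thread software decompress run would be:

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
# -t 1: run for 1 second; -w decompress: workload type; -l: compressed input file;
# -y: verify output; -T 2: two worker threads on the single core in mask 0x1
"$SPDK/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -T 2
# Cross-check of the Total row above: 95072 transfers/s at 4096 bytes each
echo $(( 95072 * 4096 / 1024 / 1024 ))   # 371 (MiB/s), matching "Total 95072/s 371 MiB/s"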
00:07:21.204 [2024-11-02 12:07:07.797940] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1141977 ] 00:07:21.204 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.204 [2024-11-02 12:07:07.864621] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.204 [2024-11-02 12:07:07.898566] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val= 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val= 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val= 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val=0x1 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val= 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val= 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val=decompress 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val= 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val=software 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@23 -- # accel_module=software 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val=32 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 
12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val=32 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val=2 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val=Yes 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val= 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:21.204 12:07:07 -- accel/accel.sh@21 -- # val= 00:07:21.204 12:07:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # IFS=: 00:07:21.204 12:07:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.142 12:07:09 -- accel/accel.sh@21 -- # val= 00:07:22.142 12:07:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # IFS=: 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # read -r var val 00:07:22.142 12:07:09 -- accel/accel.sh@21 -- # val= 00:07:22.142 12:07:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # IFS=: 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # read -r var val 00:07:22.142 12:07:09 -- accel/accel.sh@21 -- # val= 00:07:22.142 12:07:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # IFS=: 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # read -r var val 00:07:22.142 12:07:09 -- accel/accel.sh@21 -- # val= 00:07:22.142 12:07:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # IFS=: 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # read -r var val 00:07:22.142 12:07:09 -- accel/accel.sh@21 -- # val= 00:07:22.142 12:07:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # IFS=: 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # read -r var val 00:07:22.142 12:07:09 -- accel/accel.sh@21 -- # val= 00:07:22.142 12:07:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # IFS=: 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # read -r var val 00:07:22.142 12:07:09 -- accel/accel.sh@21 -- # val= 00:07:22.142 12:07:09 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # IFS=: 00:07:22.142 12:07:09 -- accel/accel.sh@20 -- # read -r var val 00:07:22.142 12:07:09 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:22.142 12:07:09 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:22.142 12:07:09 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:22.142 00:07:22.142 real 0m2.587s 00:07:22.142 user 0m2.327s 00:07:22.142 sys 0m0.270s 00:07:22.142 12:07:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:22.142 12:07:09 -- common/autotest_common.sh@10 -- # 
set +x 00:07:22.142 ************************************ 00:07:22.142 END TEST accel_decomp_mthread 00:07:22.142 ************************************ 00:07:22.142 12:07:09 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:22.142 12:07:09 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:22.142 12:07:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:22.142 12:07:09 -- common/autotest_common.sh@10 -- # set +x 00:07:22.400 ************************************ 00:07:22.400 START TEST accel_deomp_full_mthread 00:07:22.400 ************************************ 00:07:22.400 12:07:09 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:22.400 12:07:09 -- accel/accel.sh@16 -- # local accel_opc 00:07:22.400 12:07:09 -- accel/accel.sh@17 -- # local accel_module 00:07:22.400 12:07:09 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:22.401 12:07:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:22.401 12:07:09 -- accel/accel.sh@12 -- # build_accel_config 00:07:22.401 12:07:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:22.401 12:07:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.401 12:07:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.401 12:07:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:22.401 12:07:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:22.401 12:07:09 -- accel/accel.sh@41 -- # local IFS=, 00:07:22.401 12:07:09 -- accel/accel.sh@42 -- # jq -r . 00:07:22.401 [2024-11-02 12:07:09.139076] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:22.401 [2024-11-02 12:07:09.139176] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1142259 ] 00:07:22.401 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.401 [2024-11-02 12:07:09.206419] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.401 [2024-11-02 12:07:09.241828] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.775 12:07:10 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:23.775 00:07:23.775 SPDK Configuration: 00:07:23.775 Core mask: 0x1 00:07:23.775 00:07:23.775 Accel Perf Configuration: 00:07:23.775 Workload Type: decompress 00:07:23.775 Transfer size: 111250 bytes 00:07:23.775 Vector count 1 00:07:23.775 Module: software 00:07:23.775 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:23.775 Queue depth: 32 00:07:23.775 Allocate depth: 32 00:07:23.775 # threads/core: 2 00:07:23.775 Run time: 1 seconds 00:07:23.775 Verify: Yes 00:07:23.775 00:07:23.775 Running for 1 seconds... 
00:07:23.775 00:07:23.775 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:23.775 ------------------------------------------------------------------------------------ 00:07:23.775 0,1 2944/s 121 MiB/s 0 0 00:07:23.775 0,0 2944/s 121 MiB/s 0 0 00:07:23.775 ==================================================================================== 00:07:23.775 Total 5888/s 624 MiB/s 0 0' 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.775 12:07:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:23.775 12:07:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:23.775 12:07:10 -- accel/accel.sh@12 -- # build_accel_config 00:07:23.775 12:07:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:23.775 12:07:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.775 12:07:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.775 12:07:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:23.775 12:07:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:23.775 12:07:10 -- accel/accel.sh@41 -- # local IFS=, 00:07:23.775 12:07:10 -- accel/accel.sh@42 -- # jq -r . 00:07:23.775 [2024-11-02 12:07:10.449934] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:23.775 [2024-11-02 12:07:10.450036] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1142527 ] 00:07:23.775 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.775 [2024-11-02 12:07:10.517406] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.775 [2024-11-02 12:07:10.551620] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.775 12:07:10 -- accel/accel.sh@21 -- # val= 00:07:23.775 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.775 12:07:10 -- accel/accel.sh@21 -- # val= 00:07:23.775 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.775 12:07:10 -- accel/accel.sh@21 -- # val= 00:07:23.775 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.775 12:07:10 -- accel/accel.sh@21 -- # val=0x1 00:07:23.775 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.775 12:07:10 -- accel/accel.sh@21 -- # val= 00:07:23.775 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.775 12:07:10 -- accel/accel.sh@21 -- # val= 00:07:23.775 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.775 12:07:10 -- accel/accel.sh@21 -- # val=decompress 
00:07:23.775 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.775 12:07:10 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.775 12:07:10 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:23.775 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.775 12:07:10 -- accel/accel.sh@21 -- # val= 00:07:23.775 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.775 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.776 12:07:10 -- accel/accel.sh@21 -- # val=software 00:07:23.776 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.776 12:07:10 -- accel/accel.sh@23 -- # accel_module=software 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.776 12:07:10 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:23.776 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.776 12:07:10 -- accel/accel.sh@21 -- # val=32 00:07:23.776 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.776 12:07:10 -- accel/accel.sh@21 -- # val=32 00:07:23.776 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.776 12:07:10 -- accel/accel.sh@21 -- # val=2 00:07:23.776 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.776 12:07:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:23.776 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.776 12:07:10 -- accel/accel.sh@21 -- # val=Yes 00:07:23.776 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.776 12:07:10 -- accel/accel.sh@21 -- # val= 00:07:23.776 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:23.776 12:07:10 -- accel/accel.sh@21 -- # val= 00:07:23.776 12:07:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # IFS=: 00:07:23.776 12:07:10 -- accel/accel.sh@20 -- # read -r var val 00:07:25.152 12:07:11 -- accel/accel.sh@21 -- # val= 00:07:25.152 12:07:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.152 12:07:11 -- accel/accel.sh@21 -- # val= 00:07:25.152 12:07:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.152 12:07:11 -- accel/accel.sh@21 -- # val= 00:07:25.152 12:07:11 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.152 12:07:11 -- accel/accel.sh@21 -- # val= 00:07:25.152 12:07:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.152 12:07:11 -- accel/accel.sh@21 -- # val= 00:07:25.152 12:07:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.152 12:07:11 -- accel/accel.sh@21 -- # val= 00:07:25.152 12:07:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.152 12:07:11 -- accel/accel.sh@21 -- # val= 00:07:25.152 12:07:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.152 12:07:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.152 12:07:11 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:25.152 12:07:11 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:25.152 12:07:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:25.152 00:07:25.152 real 0m2.625s 00:07:25.152 user 0m2.374s 00:07:25.152 sys 0m0.257s 00:07:25.152 12:07:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.152 12:07:11 -- common/autotest_common.sh@10 -- # set +x 00:07:25.152 ************************************ 00:07:25.152 END TEST accel_deomp_full_mthread 00:07:25.152 ************************************ 00:07:25.152 12:07:11 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:25.152 12:07:11 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:25.152 12:07:11 -- accel/accel.sh@129 -- # build_accel_config 00:07:25.152 12:07:11 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:25.152 12:07:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:25.152 12:07:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:25.152 12:07:11 -- common/autotest_common.sh@10 -- # set +x 00:07:25.152 12:07:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.152 12:07:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.152 12:07:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:25.152 12:07:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:25.152 12:07:11 -- accel/accel.sh@41 -- # local IFS=, 00:07:25.152 12:07:11 -- accel/accel.sh@42 -- # jq -r . 00:07:25.152 ************************************ 00:07:25.152 START TEST accel_dif_functional_tests 00:07:25.152 ************************************ 00:07:25.152 12:07:11 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:25.152 [2024-11-02 12:07:11.816833] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
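The just-finished "full" variant differs from the preceding mthread run only in adding -o 0 to the accel_perf command line, which the configuration dump above reports as a single 111250-byte transfer instead of the 4096-byte default. The Total row is consistent with that size (a quick check, assuming the usual transfers-per-second times transfer-size arithmetic):

echo $(( 5888 * 111250 / 1024 / 1024 ))   # 624 (MiB/s), matching "Total 5888/s 624 MiB/s"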
00:07:25.152 [2024-11-02 12:07:11.816928] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1142819 ] 00:07:25.153 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.153 [2024-11-02 12:07:11.882792] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:25.153 [2024-11-02 12:07:11.919169] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:25.153 [2024-11-02 12:07:11.919262] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:25.153 [2024-11-02 12:07:11.919264] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.153 00:07:25.153 00:07:25.153 CUnit - A unit testing framework for C - Version 2.1-3 00:07:25.153 http://cunit.sourceforge.net/ 00:07:25.153 00:07:25.153 00:07:25.153 Suite: accel_dif 00:07:25.153 Test: verify: DIF generated, GUARD check ...passed 00:07:25.153 Test: verify: DIF generated, APPTAG check ...passed 00:07:25.153 Test: verify: DIF generated, REFTAG check ...passed 00:07:25.153 Test: verify: DIF not generated, GUARD check ...[2024-11-02 12:07:11.981529] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:25.153 [2024-11-02 12:07:11.981582] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:25.153 passed 00:07:25.153 Test: verify: DIF not generated, APPTAG check ...[2024-11-02 12:07:11.981634] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:25.153 [2024-11-02 12:07:11.981653] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:25.153 passed 00:07:25.153 Test: verify: DIF not generated, REFTAG check ...[2024-11-02 12:07:11.981676] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:25.153 [2024-11-02 12:07:11.981694] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:25.153 passed 00:07:25.153 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:25.153 Test: verify: APPTAG incorrect, APPTAG check ...[2024-11-02 12:07:11.981737] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:25.153 passed 00:07:25.153 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:25.153 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:25.153 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:25.153 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-11-02 12:07:11.981836] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:25.153 passed 00:07:25.153 Test: generate copy: DIF generated, GUARD check ...passed 00:07:25.153 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:25.153 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:25.153 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:25.153 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:25.153 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:25.153 Test: generate copy: iovecs-len validate ...[2024-11-02 12:07:11.982020] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:25.153 passed 00:07:25.153 Test: generate copy: buffer alignment validate ...passed 00:07:25.153 00:07:25.153 Run Summary: Type Total Ran Passed Failed Inactive 00:07:25.153 suites 1 1 n/a 0 0 00:07:25.153 tests 20 20 20 0 0 00:07:25.153 asserts 204 204 204 0 n/a 00:07:25.153 00:07:25.153 Elapsed time = 0.002 seconds 00:07:25.412 00:07:25.412 real 0m0.337s 00:07:25.412 user 0m0.526s 00:07:25.412 sys 0m0.147s 00:07:25.412 12:07:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.412 12:07:12 -- common/autotest_common.sh@10 -- # set +x 00:07:25.412 ************************************ 00:07:25.412 END TEST accel_dif_functional_tests 00:07:25.412 ************************************ 00:07:25.412 00:07:25.412 real 0m55.285s 00:07:25.412 user 1m2.908s 00:07:25.412 sys 0m7.112s 00:07:25.412 12:07:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.412 12:07:12 -- common/autotest_common.sh@10 -- # set +x 00:07:25.412 ************************************ 00:07:25.412 END TEST accel 00:07:25.412 ************************************ 00:07:25.412 12:07:12 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:25.412 12:07:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:25.412 12:07:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:25.412 12:07:12 -- common/autotest_common.sh@10 -- # set +x 00:07:25.412 ************************************ 00:07:25.412 START TEST accel_rpc 00:07:25.412 ************************************ 00:07:25.412 12:07:12 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:25.412 * Looking for test storage... 00:07:25.412 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:25.412 12:07:12 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:25.412 12:07:12 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:25.412 12:07:12 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1142880 00:07:25.412 12:07:12 -- accel/accel_rpc.sh@15 -- # waitforlisten 1142880 00:07:25.412 12:07:12 -- common/autotest_common.sh@819 -- # '[' -z 1142880 ']' 00:07:25.412 12:07:12 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:25.412 12:07:12 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:25.412 12:07:12 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:25.412 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:25.412 12:07:12 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:25.412 12:07:12 -- common/autotest_common.sh@10 -- # set +x 00:07:25.412 [2024-11-02 12:07:12.339362] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
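A recurring detail in the invocations above is -c /dev/fd/62: each test binary is pointed at a JSON accel config exposed on file descriptor 62 rather than at a file on disk. A sketch of that plumbing (the herestring redirection and the placeholder tool and config below are illustrative assumptions, not quoted from the harness):

cfg='{"subsystems": []}'               # hypothetical minimal JSON config
some_tool -c /dev/fd/62 62<<< "$cfg"   # expose $cfg on fd 62; the tool reads it as a path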
00:07:25.412 [2024-11-02 12:07:12.339420] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1142880 ] 00:07:25.412 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.671 [2024-11-02 12:07:12.404682] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.671 [2024-11-02 12:07:12.443845] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:25.671 [2024-11-02 12:07:12.443959] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.671 12:07:12 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:25.671 12:07:12 -- common/autotest_common.sh@852 -- # return 0 00:07:25.671 12:07:12 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:25.671 12:07:12 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:25.671 12:07:12 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:25.671 12:07:12 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:25.671 12:07:12 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:25.671 12:07:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:25.671 12:07:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:25.671 12:07:12 -- common/autotest_common.sh@10 -- # set +x 00:07:25.671 ************************************ 00:07:25.671 START TEST accel_assign_opcode 00:07:25.671 ************************************ 00:07:25.671 12:07:12 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:07:25.671 12:07:12 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:25.671 12:07:12 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:25.671 12:07:12 -- common/autotest_common.sh@10 -- # set +x 00:07:25.671 [2024-11-02 12:07:12.528494] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:25.671 12:07:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:25.671 12:07:12 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:25.671 12:07:12 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:25.671 12:07:12 -- common/autotest_common.sh@10 -- # set +x 00:07:25.671 [2024-11-02 12:07:12.536509] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:25.671 12:07:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:25.671 12:07:12 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:25.671 12:07:12 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:25.671 12:07:12 -- common/autotest_common.sh@10 -- # set +x 00:07:25.930 12:07:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:25.930 12:07:12 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:25.930 12:07:12 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:25.930 12:07:12 -- common/autotest_common.sh@10 -- # set +x 00:07:25.930 12:07:12 -- accel/accel_rpc.sh@42 -- # grep software 00:07:25.930 12:07:12 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:25.930 12:07:12 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:25.930 software 00:07:25.930 00:07:25.930 real 0m0.231s 00:07:25.930 user 0m0.044s 00:07:25.930 sys 0m0.008s 00:07:25.930 12:07:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.930 12:07:12 -- common/autotest_common.sh@10 -- # set +x 
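Since rpc_cmd wraps scripts/rpc.py, the opcode-assignment sequence just exercised boils down to the sketch below (the target was started with --wait-for-rpc, so both assignments land before framework init):

RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
$RPC accel_assign_opc -o copy -m incorrect    # accepted pre-init, even for a bogus module
$RPC accel_assign_opc -o copy -m software     # the later assignment wins
$RPC framework_start_init
$RPC accel_get_opc_assignments | jq -r .copy  # prints: software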
00:07:25.930 ************************************ 00:07:25.930 END TEST accel_assign_opcode 00:07:25.930 ************************************ 00:07:25.930 12:07:12 -- accel/accel_rpc.sh@55 -- # killprocess 1142880 00:07:25.930 12:07:12 -- common/autotest_common.sh@926 -- # '[' -z 1142880 ']' 00:07:25.930 12:07:12 -- common/autotest_common.sh@930 -- # kill -0 1142880 00:07:25.930 12:07:12 -- common/autotest_common.sh@931 -- # uname 00:07:25.930 12:07:12 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:25.930 12:07:12 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1142880 00:07:25.930 12:07:12 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:25.930 12:07:12 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:25.930 12:07:12 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1142880' 00:07:25.930 killing process with pid 1142880 00:07:25.930 12:07:12 -- common/autotest_common.sh@945 -- # kill 1142880 00:07:25.930 12:07:12 -- common/autotest_common.sh@950 -- # wait 1142880 00:07:26.189 00:07:26.189 real 0m0.919s 00:07:26.189 user 0m0.838s 00:07:26.189 sys 0m0.419s 00:07:26.189 12:07:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:26.189 12:07:13 -- common/autotest_common.sh@10 -- # set +x 00:07:26.189 ************************************ 00:07:26.189 END TEST accel_rpc 00:07:26.189 ************************************ 00:07:26.448 12:07:13 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:26.448 12:07:13 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:26.448 12:07:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:26.448 12:07:13 -- common/autotest_common.sh@10 -- # set +x 00:07:26.448 ************************************ 00:07:26.448 START TEST app_cmdline 00:07:26.448 ************************************ 00:07:26.448 12:07:13 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:26.448 * Looking for test storage... 00:07:26.448 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:26.448 12:07:13 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:26.448 12:07:13 -- app/cmdline.sh@17 -- # spdk_tgt_pid=1143210 00:07:26.448 12:07:13 -- app/cmdline.sh@18 -- # waitforlisten 1143210 00:07:26.448 12:07:13 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:26.448 12:07:13 -- common/autotest_common.sh@819 -- # '[' -z 1143210 ']' 00:07:26.448 12:07:13 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:26.448 12:07:13 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:26.448 12:07:13 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:26.448 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:26.448 12:07:13 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:26.448 12:07:13 -- common/autotest_common.sh@10 -- # set +x 00:07:26.448 [2024-11-02 12:07:13.315322] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
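The run below demonstrates the --rpcs-allowed allow-list: only spdk_get_version and rpc_get_methods resolve, and any other method gets JSON-RPC error -32601. In sketch form:

RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
$RPC spdk_get_version                      # ok: version JSON for SPDK v24.01.1-pre (sha1 726a04d70)
$RPC rpc_get_methods | jq -r '.[]' | sort  # exactly the two allow-listed methods
$RPC env_dpdk_get_mem_stats                # rejected: "Method not found" (-32601)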
00:07:26.448 [2024-11-02 12:07:13.315414] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1143210 ] 00:07:26.448 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.448 [2024-11-02 12:07:13.381407] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.448 [2024-11-02 12:07:13.419023] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:26.448 [2024-11-02 12:07:13.419136] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.386 12:07:14 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:27.386 12:07:14 -- common/autotest_common.sh@852 -- # return 0 00:07:27.386 12:07:14 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:27.386 { 00:07:27.386 "version": "SPDK v24.01.1-pre git sha1 726a04d70", 00:07:27.386 "fields": { 00:07:27.386 "major": 24, 00:07:27.386 "minor": 1, 00:07:27.386 "patch": 1, 00:07:27.386 "suffix": "-pre", 00:07:27.386 "commit": "726a04d70" 00:07:27.386 } 00:07:27.386 } 00:07:27.386 12:07:14 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:27.386 12:07:14 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:27.386 12:07:14 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:27.386 12:07:14 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:27.386 12:07:14 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:27.386 12:07:14 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:27.386 12:07:14 -- common/autotest_common.sh@10 -- # set +x 00:07:27.386 12:07:14 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:27.386 12:07:14 -- app/cmdline.sh@26 -- # sort 00:07:27.386 12:07:14 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:27.386 12:07:14 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:27.386 12:07:14 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:27.386 12:07:14 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:27.386 12:07:14 -- common/autotest_common.sh@640 -- # local es=0 00:07:27.386 12:07:14 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:27.386 12:07:14 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:27.386 12:07:14 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:27.386 12:07:14 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:27.386 12:07:14 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:27.386 12:07:14 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:27.386 12:07:14 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:27.386 12:07:14 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:27.386 12:07:14 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:27.386 12:07:14 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:27.646 request: 00:07:27.646 { 00:07:27.646 "method": "env_dpdk_get_mem_stats", 00:07:27.646 "req_id": 1 00:07:27.646 } 00:07:27.646 Got JSON-RPC error response 00:07:27.646 response: 00:07:27.646 { 00:07:27.646 "code": -32601, 00:07:27.646 "message": "Method not found" 00:07:27.646 } 00:07:27.646 12:07:14 -- common/autotest_common.sh@643 -- # es=1 00:07:27.646 12:07:14 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:27.646 12:07:14 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:27.646 12:07:14 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:27.646 12:07:14 -- app/cmdline.sh@1 -- # killprocess 1143210 00:07:27.646 12:07:14 -- common/autotest_common.sh@926 -- # '[' -z 1143210 ']' 00:07:27.646 12:07:14 -- common/autotest_common.sh@930 -- # kill -0 1143210 00:07:27.646 12:07:14 -- common/autotest_common.sh@931 -- # uname 00:07:27.646 12:07:14 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:27.646 12:07:14 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1143210 00:07:27.646 12:07:14 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:27.646 12:07:14 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:27.646 12:07:14 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1143210' 00:07:27.646 killing process with pid 1143210 00:07:27.646 12:07:14 -- common/autotest_common.sh@945 -- # kill 1143210 00:07:27.646 12:07:14 -- common/autotest_common.sh@950 -- # wait 1143210 00:07:27.905 00:07:27.905 real 0m1.663s 00:07:27.905 user 0m1.938s 00:07:27.905 sys 0m0.476s 00:07:27.905 12:07:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:27.905 12:07:14 -- common/autotest_common.sh@10 -- # set +x 00:07:27.905 ************************************ 00:07:27.905 END TEST app_cmdline 00:07:27.905 ************************************ 00:07:28.165 12:07:14 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:28.165 12:07:14 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:28.165 12:07:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:28.165 12:07:14 -- common/autotest_common.sh@10 -- # set +x 00:07:28.165 ************************************ 00:07:28.165 START TEST version 00:07:28.165 ************************************ 00:07:28.165 12:07:14 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:28.165 * Looking for test storage... 
00:07:28.165 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:28.165 12:07:15 -- app/version.sh@17 -- # get_header_version major 00:07:28.165 12:07:15 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:28.165 12:07:15 -- app/version.sh@14 -- # cut -f2 00:07:28.165 12:07:15 -- app/version.sh@14 -- # tr -d '"' 00:07:28.165 12:07:15 -- app/version.sh@17 -- # major=24 00:07:28.165 12:07:15 -- app/version.sh@18 -- # get_header_version minor 00:07:28.165 12:07:15 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:28.165 12:07:15 -- app/version.sh@14 -- # cut -f2 00:07:28.165 12:07:15 -- app/version.sh@14 -- # tr -d '"' 00:07:28.165 12:07:15 -- app/version.sh@18 -- # minor=1 00:07:28.165 12:07:15 -- app/version.sh@19 -- # get_header_version patch 00:07:28.165 12:07:15 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:28.165 12:07:15 -- app/version.sh@14 -- # cut -f2 00:07:28.165 12:07:15 -- app/version.sh@14 -- # tr -d '"' 00:07:28.165 12:07:15 -- app/version.sh@19 -- # patch=1 00:07:28.165 12:07:15 -- app/version.sh@20 -- # get_header_version suffix 00:07:28.165 12:07:15 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:28.165 12:07:15 -- app/version.sh@14 -- # cut -f2 00:07:28.165 12:07:15 -- app/version.sh@14 -- # tr -d '"' 00:07:28.165 12:07:15 -- app/version.sh@20 -- # suffix=-pre 00:07:28.165 12:07:15 -- app/version.sh@22 -- # version=24.1 00:07:28.165 12:07:15 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:28.165 12:07:15 -- app/version.sh@25 -- # version=24.1.1 00:07:28.165 12:07:15 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:28.165 12:07:15 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:28.165 12:07:15 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:28.165 12:07:15 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:28.165 12:07:15 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:28.165 00:07:28.165 real 0m0.172s 00:07:28.165 user 0m0.083s 00:07:28.165 sys 0m0.136s 00:07:28.165 12:07:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:28.165 12:07:15 -- common/autotest_common.sh@10 -- # set +x 00:07:28.165 ************************************ 00:07:28.165 END TEST version 00:07:28.165 ************************************ 00:07:28.165 12:07:15 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:07:28.165 12:07:15 -- spdk/autotest.sh@204 -- # uname -s 00:07:28.165 12:07:15 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:07:28.165 12:07:15 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:28.165 12:07:15 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:28.165 12:07:15 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:07:28.165 12:07:15 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:07:28.165 12:07:15 -- spdk/autotest.sh@268 -- # timing_exit lib 
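Condensed, the version.sh extraction traced above amounts to the sketch below (same tree assumed; the test then rewrites the -pre suffix as rc0 before comparing against the Python package):

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
v() { grep -E "^#define SPDK_VERSION_$1[[:space:]]+" "$SPDK/include/spdk/version.h" | cut -f2 | tr -d '"'; }
echo "$(v MAJOR).$(v MINOR).$(v PATCH)$(v SUFFIX)"                            # 24.1.1-pre
PYTHONPATH="$SPDK/python" python3 -c 'import spdk; print(spdk.__version__)'   # 24.1.1rc0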
00:07:28.165 12:07:15 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:28.165 12:07:15 -- common/autotest_common.sh@10 -- # set +x 00:07:28.426 12:07:15 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:28.426 12:07:15 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:28.426 12:07:15 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:28.426 12:07:15 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:28.426 12:07:15 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:28.426 12:07:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:28.426 12:07:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:28.426 12:07:15 -- common/autotest_common.sh@10 -- # set +x 00:07:28.426 ************************************ 00:07:28.426 START TEST llvm_fuzz 00:07:28.426 ************************************ 00:07:28.426 12:07:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:28.426 * Looking for test storage... 
00:07:28.426 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:28.426 12:07:15 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:28.426 12:07:15 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:28.426 12:07:15 -- common/autotest_common.sh@538 -- # fuzzers=() 00:07:28.426 12:07:15 -- common/autotest_common.sh@538 -- # local fuzzers 00:07:28.426 12:07:15 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:07:28.426 12:07:15 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:28.427 12:07:15 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:28.427 12:07:15 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:28.427 12:07:15 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:28.427 12:07:15 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:28.427 12:07:15 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:07:28.427 12:07:15 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:28.427 12:07:15 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:28.427 12:07:15 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:28.427 12:07:15 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:28.427 12:07:15 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:28.427 12:07:15 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:28.427 12:07:15 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:28.427 12:07:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:28.427 12:07:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:28.427 12:07:15 -- common/autotest_common.sh@10 -- # set +x 00:07:28.427 ************************************ 00:07:28.427 START TEST nvmf_fuzz 00:07:28.427 ************************************ 00:07:28.427 12:07:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:28.427 * Looking for test storage... 
00:07:28.427 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:28.427 12:07:15 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:28.427 12:07:15 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:28.427 12:07:15 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:28.427 12:07:15 -- common/autotest_common.sh@34 -- # set -e 00:07:28.427 12:07:15 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:28.427 12:07:15 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:28.427 12:07:15 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:28.427 12:07:15 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:28.427 12:07:15 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:28.427 12:07:15 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:28.427 12:07:15 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:28.427 12:07:15 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:28.427 12:07:15 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:28.427 12:07:15 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:28.427 12:07:15 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:28.427 12:07:15 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:28.427 12:07:15 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:28.427 12:07:15 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:28.427 12:07:15 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:28.427 12:07:15 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:28.427 12:07:15 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:28.427 12:07:15 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:28.427 12:07:15 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:28.427 12:07:15 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:28.427 12:07:15 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:28.427 12:07:15 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:28.427 12:07:15 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:28.427 12:07:15 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:28.427 12:07:15 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:28.427 12:07:15 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:28.427 12:07:15 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:28.427 12:07:15 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:28.427 12:07:15 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:28.427 12:07:15 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:28.427 12:07:15 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:28.427 12:07:15 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:28.427 12:07:15 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:28.427 12:07:15 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:28.427 12:07:15 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:28.427 12:07:15 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:28.427 12:07:15 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:28.427 12:07:15 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:28.427 12:07:15 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:28.427 12:07:15 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:28.427 12:07:15 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:28.427 12:07:15 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:28.427 12:07:15 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:28.427 12:07:15 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:28.427 12:07:15 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:28.427 12:07:15 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:28.427 12:07:15 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:28.427 12:07:15 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:28.427 12:07:15 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:28.427 12:07:15 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:28.427 12:07:15 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:28.427 12:07:15 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:28.427 12:07:15 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:28.427 12:07:15 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:28.427 12:07:15 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:28.427 12:07:15 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:28.427 12:07:15 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:28.427 12:07:15 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:28.427 12:07:15 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:28.427 12:07:15 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:28.427 12:07:15 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:28.427 12:07:15 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:28.427 12:07:15 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:28.427 12:07:15 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:28.427 12:07:15 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:28.427 12:07:15 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:28.427 12:07:15 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:28.427 12:07:15 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:28.427 12:07:15 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:28.427 12:07:15 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:28.427 12:07:15 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:28.427 12:07:15 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:28.427 12:07:15 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:28.427 12:07:15 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:28.427 12:07:15 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:28.427 12:07:15 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:28.427 12:07:15 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:28.427 12:07:15 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:28.427 12:07:15 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:28.427 12:07:15 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:28.427 12:07:15 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:28.427 12:07:15 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 
00:07:28.427 12:07:15 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:28.427 12:07:15 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:28.427 12:07:15 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:28.689 12:07:15 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:28.689 12:07:15 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:28.689 12:07:15 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:28.689 12:07:15 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:28.689 12:07:15 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:28.689 12:07:15 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:28.689 12:07:15 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:28.689 12:07:15 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:28.689 12:07:15 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:28.689 12:07:15 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:28.689 12:07:15 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:28.689 12:07:15 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:28.689 12:07:15 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:28.689 12:07:15 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:28.689 #define SPDK_CONFIG_H 00:07:28.689 #define SPDK_CONFIG_APPS 1 00:07:28.689 #define SPDK_CONFIG_ARCH native 00:07:28.689 #undef SPDK_CONFIG_ASAN 00:07:28.689 #undef SPDK_CONFIG_AVAHI 00:07:28.689 #undef SPDK_CONFIG_CET 00:07:28.689 #define SPDK_CONFIG_COVERAGE 1 00:07:28.689 #define SPDK_CONFIG_CROSS_PREFIX 00:07:28.689 #undef SPDK_CONFIG_CRYPTO 00:07:28.689 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:28.689 #undef SPDK_CONFIG_CUSTOMOCF 00:07:28.689 #undef SPDK_CONFIG_DAOS 00:07:28.689 #define SPDK_CONFIG_DAOS_DIR 00:07:28.689 #define SPDK_CONFIG_DEBUG 1 00:07:28.689 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:28.689 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:28.689 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:28.689 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:28.689 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:28.689 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:28.689 #define SPDK_CONFIG_EXAMPLES 1 00:07:28.689 #undef SPDK_CONFIG_FC 00:07:28.689 #define SPDK_CONFIG_FC_PATH 00:07:28.689 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:28.689 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:28.689 #undef SPDK_CONFIG_FUSE 00:07:28.689 #define SPDK_CONFIG_FUZZER 1 00:07:28.689 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:28.689 #undef SPDK_CONFIG_GOLANG 00:07:28.689 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:28.689 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 
00:07:28.689 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:28.689 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:28.689 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:28.689 #define SPDK_CONFIG_IDXD 1 00:07:28.689 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:28.689 #undef SPDK_CONFIG_IPSEC_MB 00:07:28.689 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:28.689 #define SPDK_CONFIG_ISAL 1 00:07:28.689 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:28.689 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:28.689 #define SPDK_CONFIG_LIBDIR 00:07:28.689 #undef SPDK_CONFIG_LTO 00:07:28.689 #define SPDK_CONFIG_MAX_LCORES 00:07:28.689 #define SPDK_CONFIG_NVME_CUSE 1 00:07:28.689 #undef SPDK_CONFIG_OCF 00:07:28.689 #define SPDK_CONFIG_OCF_PATH 00:07:28.689 #define SPDK_CONFIG_OPENSSL_PATH 00:07:28.689 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:28.689 #undef SPDK_CONFIG_PGO_USE 00:07:28.689 #define SPDK_CONFIG_PREFIX /usr/local 00:07:28.689 #undef SPDK_CONFIG_RAID5F 00:07:28.689 #undef SPDK_CONFIG_RBD 00:07:28.689 #define SPDK_CONFIG_RDMA 1 00:07:28.689 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:28.689 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:28.689 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:28.689 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:28.689 #undef SPDK_CONFIG_SHARED 00:07:28.689 #undef SPDK_CONFIG_SMA 00:07:28.689 #define SPDK_CONFIG_TESTS 1 00:07:28.689 #undef SPDK_CONFIG_TSAN 00:07:28.689 #define SPDK_CONFIG_UBLK 1 00:07:28.689 #define SPDK_CONFIG_UBSAN 1 00:07:28.689 #undef SPDK_CONFIG_UNIT_TESTS 00:07:28.689 #undef SPDK_CONFIG_URING 00:07:28.689 #define SPDK_CONFIG_URING_PATH 00:07:28.689 #undef SPDK_CONFIG_URING_ZNS 00:07:28.689 #undef SPDK_CONFIG_USDT 00:07:28.689 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:28.689 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:28.689 #define SPDK_CONFIG_VFIO_USER 1 00:07:28.689 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:28.689 #define SPDK_CONFIG_VHOST 1 00:07:28.689 #define SPDK_CONFIG_VIRTIO 1 00:07:28.689 #undef SPDK_CONFIG_VTUNE 00:07:28.689 #define SPDK_CONFIG_VTUNE_DIR 00:07:28.689 #define SPDK_CONFIG_WERROR 1 00:07:28.689 #define SPDK_CONFIG_WPDK_DIR 00:07:28.689 #undef SPDK_CONFIG_XNVME 00:07:28.689 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:28.689 12:07:15 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:28.689 12:07:15 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:28.689 12:07:15 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:28.689 12:07:15 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:28.689 12:07:15 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:28.689 12:07:15 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:28.689 12:07:15 -- paths/export.sh@3 -- # 
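Two things happen in the span above: build_config.sh's CONFIG_X=y/n switches are mirrored into include/spdk/config.h as '#define SPDK_CONFIG_X 1' or '#undef SPDK_CONFIG_X', and applications.sh@23 then pattern-matches the whole generated header to confirm this is a debug build. The backslash-heavy '*\#\d\e\f\i\n\e...*' in the trace is just bash quoting the glob pattern character by character; a sketch of the same check:

config_h=$rootdir/include/spdk/config.h
if [[ -e $config_h && $(< "$config_h") == *"#define SPDK_CONFIG_DEBUG"* ]]; then
    : # debug build confirmed; only now is SPDK_AUTOTEST_DEBUG_APPS consulted (@24)
fi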
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:28.689 12:07:15 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:28.689 12:07:15 -- paths/export.sh@5 -- # export PATH 00:07:28.689 12:07:15 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:28.689 12:07:15 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:28.689 12:07:15 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:28.689 12:07:15 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:28.689 12:07:15 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:28.689 12:07:15 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:28.689 12:07:15 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:28.689 12:07:15 -- pm/common@16 -- # TEST_TAG=N/A 00:07:28.689 12:07:15 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:28.689 12:07:15 -- common/autotest_common.sh@52 -- # : 1 00:07:28.689 12:07:15 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:28.689 12:07:15 -- common/autotest_common.sh@56 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:28.689 12:07:15 -- common/autotest_common.sh@58 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:28.689 12:07:15 -- common/autotest_common.sh@60 -- # : 1 00:07:28.689 12:07:15 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:28.689 12:07:15 -- common/autotest_common.sh@62 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:28.689 12:07:15 -- common/autotest_common.sh@64 -- # : 00:07:28.689 12:07:15 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:28.689 12:07:15 -- common/autotest_common.sh@66 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:28.689 12:07:15 
-- common/autotest_common.sh@68 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:28.689 12:07:15 -- common/autotest_common.sh@70 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:28.689 12:07:15 -- common/autotest_common.sh@72 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:28.689 12:07:15 -- common/autotest_common.sh@74 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:28.689 12:07:15 -- common/autotest_common.sh@76 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:28.689 12:07:15 -- common/autotest_common.sh@78 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:28.689 12:07:15 -- common/autotest_common.sh@80 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:28.689 12:07:15 -- common/autotest_common.sh@82 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:28.689 12:07:15 -- common/autotest_common.sh@84 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:28.689 12:07:15 -- common/autotest_common.sh@86 -- # : 0 00:07:28.689 12:07:15 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:28.690 12:07:15 -- common/autotest_common.sh@88 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:28.690 12:07:15 -- common/autotest_common.sh@90 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:28.690 12:07:15 -- common/autotest_common.sh@92 -- # : 1 00:07:28.690 12:07:15 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:28.690 12:07:15 -- common/autotest_common.sh@94 -- # : 1 00:07:28.690 12:07:15 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:28.690 12:07:15 -- common/autotest_common.sh@96 -- # : rdma 00:07:28.690 12:07:15 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:28.690 12:07:15 -- common/autotest_common.sh@98 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:28.690 12:07:15 -- common/autotest_common.sh@100 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:28.690 12:07:15 -- common/autotest_common.sh@102 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:28.690 12:07:15 -- common/autotest_common.sh@104 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:28.690 12:07:15 -- common/autotest_common.sh@106 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:28.690 12:07:15 -- common/autotest_common.sh@108 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:07:28.690 12:07:15 -- common/autotest_common.sh@110 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:28.690 12:07:15 -- common/autotest_common.sh@112 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:28.690 12:07:15 -- common/autotest_common.sh@114 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:28.690 
12:07:15 -- common/autotest_common.sh@116 -- # : 1 00:07:28.690 12:07:15 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:28.690 12:07:15 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:28.690 12:07:15 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:28.690 12:07:15 -- common/autotest_common.sh@120 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:28.690 12:07:15 -- common/autotest_common.sh@122 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:28.690 12:07:15 -- common/autotest_common.sh@124 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:28.690 12:07:15 -- common/autotest_common.sh@126 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:28.690 12:07:15 -- common/autotest_common.sh@128 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:28.690 12:07:15 -- common/autotest_common.sh@130 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:28.690 12:07:15 -- common/autotest_common.sh@132 -- # : v22.11.4 00:07:28.690 12:07:15 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:28.690 12:07:15 -- common/autotest_common.sh@134 -- # : true 00:07:28.690 12:07:15 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:28.690 12:07:15 -- common/autotest_common.sh@136 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:28.690 12:07:15 -- common/autotest_common.sh@138 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:28.690 12:07:15 -- common/autotest_common.sh@140 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:28.690 12:07:15 -- common/autotest_common.sh@142 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:28.690 12:07:15 -- common/autotest_common.sh@144 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:28.690 12:07:15 -- common/autotest_common.sh@146 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:28.690 12:07:15 -- common/autotest_common.sh@148 -- # : 00:07:28.690 12:07:15 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:28.690 12:07:15 -- common/autotest_common.sh@150 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:28.690 12:07:15 -- common/autotest_common.sh@152 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:28.690 12:07:15 -- common/autotest_common.sh@154 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:28.690 12:07:15 -- common/autotest_common.sh@156 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:28.690 12:07:15 -- common/autotest_common.sh@158 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:28.690 12:07:15 -- common/autotest_common.sh@160 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:28.690 12:07:15 -- common/autotest_common.sh@163 -- # : 00:07:28.690 12:07:15 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:28.690 12:07:15 -- common/autotest_common.sh@165 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:28.690 12:07:15 -- common/autotest_common.sh@167 -- # : 0 00:07:28.690 12:07:15 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:28.690 12:07:15 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:28.690 12:07:15 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:28.690 12:07:15 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:28.690 12:07:15 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:28.690 12:07:15 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:28.690 12:07:15 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:28.690 12:07:15 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:28.690 12:07:15 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:28.690 12:07:15 -- common/autotest_common.sh@177 
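The long run of ': <value>' / 'export SPDK_TEST_*' pairs above is the expanded form of bash's default-assignment idiom: each test knob is pinned to a default only if the environment did not already set it, then exported for every child script. A sketch using values visible in the trace:

: "${RUN_NIGHTLY:=1}";                  export RUN_NIGHTLY
: "${SPDK_TEST_FUZZER:=1}";             export SPDK_TEST_FUZZER
: "${SPDK_TEST_FUZZER_SHORT:=1}";       export SPDK_TEST_FUZZER_SHORT
: "${SPDK_TEST_NVMF_TRANSPORT:=rdma}";  export SPDK_TEST_NVMF_TRANSPORT
: "${SPDK_TEST_NATIVE_DPDK:=v22.11.4}"; export SPDK_TEST_NATIVE_DPDK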
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:28.690 12:07:15 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:28.690 12:07:15 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:28.690 12:07:15 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:28.690 12:07:15 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:28.690 12:07:15 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:28.690 12:07:15 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:28.690 12:07:15 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:28.690 12:07:15 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:28.690 12:07:15 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:28.690 12:07:15 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:28.690 12:07:15 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:28.690 12:07:15 -- common/autotest_common.sh@196 -- # cat 00:07:28.690 12:07:15 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:28.690 12:07:15 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:28.690 12:07:15 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:28.690 12:07:15 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:28.690 12:07:15 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:28.690 12:07:15 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:28.690 12:07:15 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:28.690 12:07:15 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:28.690 12:07:15 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:28.690 12:07:15 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:28.690 12:07:15 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:28.691 12:07:15 -- 
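The sanitizer plumbing traced at autotest_common.sh@189-224 exports the ASan/UBSan runtime options up front and rebuilds a LeakSanitizer suppression file on every run (the 'cat' at @196 appends further entries not shown in this log). A sketch:

export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
asan_suppression_file=/var/tmp/asan_suppression_file
rm -rf "$asan_suppression_file"
echo "leak:libfuse3.so" >> "$asan_suppression_file"   # suppress a known libfuse3 leak
export LSAN_OPTIONS=suppressions=$asan_suppression_file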
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:28.691 12:07:15 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:28.691 12:07:15 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:28.691 12:07:15 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:28.691 12:07:15 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:28.691 12:07:15 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:28.691 12:07:15 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:28.691 12:07:15 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:28.691 12:07:15 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:07:28.691 12:07:15 -- common/autotest_common.sh@249 -- # export valgrind= 00:07:28.691 12:07:15 -- common/autotest_common.sh@249 -- # valgrind= 00:07:28.691 12:07:15 -- common/autotest_common.sh@255 -- # uname -s 00:07:28.691 12:07:15 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:07:28.691 12:07:15 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:07:28.691 12:07:15 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:07:28.691 12:07:15 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:07:28.691 12:07:15 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:28.691 12:07:15 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:28.691 12:07:15 -- common/autotest_common.sh@265 -- # MAKE=make 00:07:28.691 12:07:15 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:07:28.691 12:07:15 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:07:28.691 12:07:15 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:07:28.691 12:07:15 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:28.691 12:07:15 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:07:28.691 12:07:15 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:07:28.691 12:07:15 -- common/autotest_common.sh@309 -- # [[ -z 1143636 ]] 00:07:28.691 12:07:15 -- common/autotest_common.sh@309 -- # kill -0 1143636 00:07:28.691 12:07:15 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:07:28.691 12:07:15 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:07:28.691 12:07:15 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:07:28.691 12:07:15 -- common/autotest_common.sh@322 -- # local mount target_dir 00:07:28.691 12:07:15 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:07:28.691 12:07:15 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:07:28.691 12:07:15 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:07:28.691 12:07:15 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:07:28.691 12:07:15 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.NdIkc8 00:07:28.691 12:07:15 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:28.691 12:07:15 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:07:28.691 12:07:15 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:07:28.691 12:07:15 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.NdIkc8/tests/nvmf /tmp/spdk.NdIkc8 00:07:28.691 12:07:15 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:07:28.691 12:07:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:28.691 12:07:15 -- common/autotest_common.sh@318 -- # df -T 00:07:28.691 12:07:15 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:07:28.691 12:07:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:07:28.691 12:07:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=4096 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:07:28.691 12:07:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=5284425728 00:07:28.691 12:07:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=52171829248 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61730603008 00:07:28.691 12:07:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=9558773760 00:07:28.691 12:07:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=30862708736 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30865301504 00:07:28.691 12:07:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:07:28.691 12:07:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=12340129792 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12346122240 00:07:28.691 12:07:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=5992448 00:07:28.691 12:07:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=30864347136 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30865301504 00:07:28.691 12:07:15 -- common/autotest_common.sh@354 -- 
# uses["$mount"]=954368 00:07:28.691 12:07:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:28.691 12:07:15 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # avails["$mount"]=6173044736 00:07:28.691 12:07:15 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6173057024 00:07:28.691 12:07:15 -- common/autotest_common.sh@354 -- # uses["$mount"]=12288 00:07:28.691 12:07:15 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:28.691 12:07:15 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:07:28.691 * Looking for test storage... 00:07:28.691 12:07:15 -- common/autotest_common.sh@359 -- # local target_space new_size 00:07:28.691 12:07:15 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:07:28.691 12:07:15 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:28.691 12:07:15 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:28.691 12:07:15 -- common/autotest_common.sh@363 -- # mount=/ 00:07:28.691 12:07:15 -- common/autotest_common.sh@365 -- # target_space=52171829248 00:07:28.691 12:07:15 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:07:28.691 12:07:15 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:07:28.691 12:07:15 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:07:28.691 12:07:15 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:07:28.691 12:07:15 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:07:28.691 12:07:15 -- common/autotest_common.sh@372 -- # new_size=11773366272 00:07:28.691 12:07:15 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:28.691 12:07:15 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:28.691 12:07:15 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:28.691 12:07:15 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:28.691 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:28.691 12:07:15 -- common/autotest_common.sh@380 -- # return 0 00:07:28.691 12:07:15 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:07:28.691 12:07:15 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:07:28.691 12:07:15 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:28.691 12:07:15 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:28.691 12:07:15 -- common/autotest_common.sh@1672 -- # true 00:07:28.691 12:07:15 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:07:28.691 12:07:15 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:28.691 12:07:15 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:28.691 12:07:15 -- common/autotest_common.sh@27 -- # exec 00:07:28.691 12:07:15 -- common/autotest_common.sh@29 -- # exec 00:07:28.691 12:07:15 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:28.691 12:07:15 
-- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:28.691 12:07:15 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:28.691 12:07:15 -- common/autotest_common.sh@18 -- # set -x 00:07:28.691 12:07:15 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:28.691 12:07:15 -- ../common.sh@8 -- # pids=() 00:07:28.691 12:07:15 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:28.691 12:07:15 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:28.691 12:07:15 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:28.691 12:07:15 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:28.691 12:07:15 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:28.691 12:07:15 -- nvmf/run.sh@61 -- # mem_size=512 00:07:28.691 12:07:15 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:28.692 12:07:15 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:28.692 12:07:15 -- ../common.sh@69 -- # local fuzz_num=25 00:07:28.692 12:07:15 -- ../common.sh@70 -- # local time=1 00:07:28.692 12:07:15 -- ../common.sh@72 -- # (( i = 0 )) 00:07:28.692 12:07:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:28.692 12:07:15 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:28.692 12:07:15 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:28.692 12:07:15 -- nvmf/run.sh@24 -- # local timen=1 00:07:28.692 12:07:15 -- nvmf/run.sh@25 -- # local core=0x1 00:07:28.692 12:07:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:28.692 12:07:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:28.692 12:07:15 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:28.692 12:07:15 -- nvmf/run.sh@29 -- # port=4400 00:07:28.692 12:07:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:28.692 12:07:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:28.692 12:07:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:28.692 12:07:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:28.692 [2024-11-02 12:07:15.593812] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
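nvmf/run.sh derives everything for fuzzer 0 from two conventions visible above: the fuzzer count is taken straight from the harness source (grep -c '\.fn =' finds 25 registered command mutators), and fuzzer N listens on TCP port 44NN with a JSON config rewritten from the shared 4420 template. A sketch mirroring the traced commands (the redirection of sed's output into the per-fuzzer config is implied by the -c argument rather than shown):

fuzz_num=$(grep -c '\.fn =' "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c")   # 25
fuzzer_type=0
port=44$(printf %02d "$fuzzer_type")                 # 4400 for fuzzer 0
nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
"$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -F "$trid" -c "$nvmf_cfg" -t 1 \
    -D "$rootdir/../corpus/llvm_nvmf_${fuzzer_type}" -Z "$fuzzer_type" \
    -r "/var/tmp/spdk${fuzzer_type}.sock"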
00:07:28.692 [2024-11-02 12:07:15.593907] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1143676 ] 00:07:28.692 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.951 [2024-11-02 12:07:15.849766] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.951 [2024-11-02 12:07:15.876820] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:28.951 [2024-11-02 12:07:15.876960] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.210 [2024-11-02 12:07:15.928224] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:29.210 [2024-11-02 12:07:15.944610] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:29.210 INFO: Running with entropic power schedule (0xFF, 100). 00:07:29.210 INFO: Seed: 1259642110 00:07:29.210 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:29.210 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:29.210 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:29.210 INFO: A corpus is not provided, starting from an empty corpus 00:07:29.210 #2 INITED exec/s: 0 rss: 59Mb 00:07:29.210 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:29.210 This may also happen if the target rejected all inputs we tried so far 00:07:29.210 [2024-11-02 12:07:15.989689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (72) qid:0 cid:4 nsid:72727272 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7272727272727272 00:07:29.210 [2024-11-02 12:07:15.989723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.470 NEW_FUNC[1/670]: 0x451418 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:29.470 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:29.470 #4 NEW cov: 11550 ft: 11551 corp: 2/88b lim: 320 exec/s: 0 rss: 67Mb L: 87/87 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:29.470 [2024-11-02 12:07:16.280492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.470 [2024-11-02 12:07:16.280525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.470 NEW_FUNC[1/1]: 0x12de638 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2014 00:07:29.470 #9 NEW cov: 11694 ft: 11988 corp: 3/213b lim: 320 exec/s: 0 rss: 68Mb L: 125/125 MS: 5 InsertByte-InsertRepeatedBytes-EraseBytes-ChangeBit-InsertRepeatedBytes- 00:07:29.470 [2024-11-02 12:07:16.320520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (72) qid:0 cid:4 nsid:72727272 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7272727272727272 00:07:29.470 [2024-11-02 12:07:16.320548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
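For readers new to libFuzzer output: in the '#N NEW cov: ...' status lines that follow, cov counts coverage points hit so far, ft counts finer-grained features, lim is the current input-length cap, exec/s is throughput, and the trailing 'L: x/y MS: ...' gives the new input's length and the mutation sequence that produced it; NEW_FUNC lines name functions reached for the first time. A saved run log can be skimmed with ordinary text tools, e.g. (the log file name below is hypothetical):

grep -c ' NEW ' llvm_nvmf_0.log                       # coverage-increasing inputs found
grep -o 'exec/s: [0-9]*' llvm_nvmf_0.log | tail -1    # most recent throughput sample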
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.470 #10 NEW cov: 11700 ft: 12322 corp: 4/302b lim: 320 exec/s: 0 rss: 68Mb L: 89/125 MS: 1 CrossOver- 00:07:29.470 [2024-11-02 12:07:16.360673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.470 [2024-11-02 12:07:16.360699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.470 #11 NEW cov: 11785 ft: 12632 corp: 5/427b lim: 320 exec/s: 0 rss: 68Mb L: 125/125 MS: 1 ChangeBit- 00:07:29.470 [2024-11-02 12:07:16.400796] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.470 [2024-11-02 12:07:16.400822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.470 #13 NEW cov: 11802 ft: 12765 corp: 6/543b lim: 320 exec/s: 0 rss: 68Mb L: 116/125 MS: 2 CopyPart-CrossOver- 00:07:29.470 [2024-11-02 12:07:16.440985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.470 [2024-11-02 12:07:16.441015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.729 #14 NEW cov: 11802 ft: 12870 corp: 7/659b lim: 320 exec/s: 0 rss: 68Mb L: 116/125 MS: 1 ChangeASCIIInt- 00:07:29.729 [2024-11-02 12:07:16.481250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.729 [2024-11-02 12:07:16.481275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.729 [2024-11-02 12:07:16.481341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.729 [2024-11-02 12:07:16.481355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.729 [2024-11-02 12:07:16.481410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:29.729 [2024-11-02 12:07:16.481427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.729 #15 NEW cov: 11804 ft: 13210 corp: 8/867b lim: 320 exec/s: 0 rss: 68Mb L: 208/208 MS: 1 InsertRepeatedBytes- 00:07:29.729 [2024-11-02 12:07:16.521163] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.729 [2024-11-02 12:07:16.521188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.729 #16 NEW cov: 11804 ft: 13253 corp: 9/983b lim: 320 exec/s: 0 rss: 68Mb L: 116/208 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\004"- 00:07:29.729 [2024-11-02 12:07:16.561321] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 
0xffffffffffffffff 00:07:29.729 [2024-11-02 12:07:16.561346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.729 #17 NEW cov: 11804 ft: 13330 corp: 10/1099b lim: 320 exec/s: 0 rss: 68Mb L: 116/208 MS: 1 ChangeASCIIInt- 00:07:29.729 [2024-11-02 12:07:16.601566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:46ffffff cdw10:ffff4646 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:29.729 [2024-11-02 12:07:16.601592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.729 [2024-11-02 12:07:16.601668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffdfff 00:07:29.729 [2024-11-02 12:07:16.601682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.729 [2024-11-02 12:07:16.601744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.729 [2024-11-02 12:07:16.601758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.729 #18 NEW cov: 11804 ft: 13458 corp: 11/1342b lim: 320 exec/s: 0 rss: 68Mb L: 243/243 MS: 1 InsertRepeatedBytes- 00:07:29.729 [2024-11-02 12:07:16.641562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.729 [2024-11-02 12:07:16.641587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.729 #19 NEW cov: 11804 ft: 13483 corp: 12/1458b lim: 320 exec/s: 0 rss: 68Mb L: 116/243 MS: 1 ShuffleBytes- 00:07:29.729 [2024-11-02 12:07:16.681663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (72) qid:0 cid:4 nsid:72727272 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7272727272727272 00:07:29.730 [2024-11-02 12:07:16.681688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.730 #20 NEW cov: 11804 ft: 13507 corp: 13/1545b lim: 320 exec/s: 0 rss: 68Mb L: 87/243 MS: 1 CopyPart- 00:07:29.989 [2024-11-02 12:07:16.721836] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.989 [2024-11-02 12:07:16.721862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.989 [2024-11-02 12:07:16.721938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffdfffffff 00:07:29.989 [2024-11-02 12:07:16.721952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.989 #21 NEW cov: 11804 ft: 13693 corp: 14/1697b lim: 320 exec/s: 0 rss: 69Mb L: 152/243 MS: 1 InsertRepeatedBytes- 00:07:29.989 [2024-11-02 12:07:16.761998] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.989 [2024-11-02 12:07:16.762025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.989 [2024-11-02 12:07:16.762089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffdf 00:07:29.989 [2024-11-02 12:07:16.762103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.989 #22 NEW cov: 11804 ft: 13713 corp: 15/1846b lim: 320 exec/s: 0 rss: 69Mb L: 149/243 MS: 1 CopyPart- 00:07:29.989 [2024-11-02 12:07:16.802028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.989 [2024-11-02 12:07:16.802055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.989 #23 NEW cov: 11804 ft: 13748 corp: 16/1962b lim: 320 exec/s: 0 rss: 69Mb L: 116/243 MS: 1 ChangeASCIIInt- 00:07:29.989 [2024-11-02 12:07:16.842116] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.989 [2024-11-02 12:07:16.842142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.989 #24 NEW cov: 11804 ft: 13769 corp: 17/2045b lim: 320 exec/s: 0 rss: 69Mb L: 83/243 MS: 1 EraseBytes- 00:07:29.989 [2024-11-02 12:07:16.872202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (72) qid:0 cid:4 nsid:72727272 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7272727272727272 00:07:29.989 [2024-11-02 12:07:16.872228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.989 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:29.989 #25 NEW cov: 11827 ft: 13872 corp: 18/2132b lim: 320 exec/s: 0 rss: 69Mb L: 87/243 MS: 1 ShuffleBytes- 00:07:29.989 [2024-11-02 12:07:16.912431] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.989 [2024-11-02 12:07:16.912457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.989 [2024-11-02 12:07:16.912522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffdfffffff 00:07:29.989 [2024-11-02 12:07:16.912536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.989 #26 NEW cov: 11827 ft: 13895 corp: 19/2284b lim: 320 exec/s: 0 rss: 69Mb L: 152/243 MS: 1 ChangeASCIIInt- 00:07:29.989 [2024-11-02 12:07:16.952466] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:29.989 [2024-11-02 12:07:16.952493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.249 #27 NEW cov: 11827 ft: 13901 corp: 20/2401b lim: 320 exec/s: 0 rss: 69Mb L: 117/243 MS: 1 InsertByte- 00:07:30.249 [2024-11-02 12:07:16.982725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.249 [2024-11-02 12:07:16.982751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.249 [2024-11-02 12:07:16.982832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.249 [2024-11-02 12:07:16.982849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.249 [2024-11-02 12:07:16.982903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.249 [2024-11-02 12:07:16.982917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.249 #28 NEW cov: 11827 ft: 13910 corp: 21/2609b lim: 320 exec/s: 28 rss: 69Mb L: 208/243 MS: 1 ChangeBit- 00:07:30.249 [2024-11-02 12:07:17.022679] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.249 [2024-11-02 12:07:17.022705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.249 #29 NEW cov: 11827 ft: 13922 corp: 22/2725b lim: 320 exec/s: 29 rss: 69Mb L: 116/243 MS: 1 ChangeByte- 00:07:30.249 [2024-11-02 12:07:17.052711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:0002ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.249 [2024-11-02 12:07:17.052734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.249 [2024-11-02 12:07:17.052755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.249 [2024-11-02 12:07:17.052765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.249 [2024-11-02 12:07:17.052781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.249 [2024-11-02 12:07:17.052791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.249 #30 NEW cov: 11827 ft: 13978 corp: 23/2933b lim: 320 exec/s: 30 rss: 69Mb L: 208/243 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:07:30.249 [2024-11-02 12:07:17.082840] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:dfffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.249 [2024-11-02 12:07:17.082866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.249 #31 NEW cov: 11827 
ft: 14016 corp: 24/3037b lim: 320 exec/s: 31 rss: 69Mb L: 104/243 MS: 1 EraseBytes- 00:07:30.249 [2024-11-02 12:07:17.113117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:46bfffff cdw10:ffff4646 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:30.249 [2024-11-02 12:07:17.113144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.249 [2024-11-02 12:07:17.113208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffdfff 00:07:30.249 [2024-11-02 12:07:17.113222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.249 [2024-11-02 12:07:17.113299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.249 [2024-11-02 12:07:17.113314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.249 #32 NEW cov: 11827 ft: 14031 corp: 25/3280b lim: 320 exec/s: 32 rss: 69Mb L: 243/243 MS: 1 ChangeBit- 00:07:30.249 [2024-11-02 12:07:17.153090] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.249 [2024-11-02 12:07:17.153115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.249 #33 NEW cov: 11827 ft: 14064 corp: 26/3404b lim: 320 exec/s: 33 rss: 69Mb L: 124/243 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:07:30.250 [2024-11-02 12:07:17.193200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (30) qid:0 cid:4 nsid:4e4e4e4e cdw10:4e4e4e4e cdw11:4e4e4e4e 00:07:30.250 [2024-11-02 12:07:17.193225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.250 #36 NEW cov: 11827 ft: 14082 corp: 27/3487b lim: 320 exec/s: 36 rss: 69Mb L: 83/243 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:07:30.250 [2024-11-02 12:07:17.223371] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.250 [2024-11-02 12:07:17.223397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.250 [2024-11-02 12:07:17.223456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.250 [2024-11-02 12:07:17.223470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.529 #37 NEW cov: 11827 ft: 14091 corp: 28/3647b lim: 320 exec/s: 37 rss: 69Mb L: 160/243 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\004"- 00:07:30.529 [2024-11-02 12:07:17.263545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (91) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:30.529 [2024-11-02 12:07:17.263570] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.529 [2024-11-02 12:07:17.263631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.529 [2024-11-02 12:07:17.263644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.529 NEW_FUNC[1/1]: 0x16dd468 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:30.529 #42 NEW cov: 11840 ft: 14403 corp: 29/3810b lim: 320 exec/s: 42 rss: 69Mb L: 163/243 MS: 5 ChangeByte-InsertByte-EraseBytes-InsertRepeatedBytes-InsertRepeatedBytes- 00:07:30.529 [2024-11-02 12:07:17.303526] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.529 [2024-11-02 12:07:17.303551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.529 #43 NEW cov: 11840 ft: 14432 corp: 30/3927b lim: 320 exec/s: 43 rss: 69Mb L: 117/243 MS: 1 ChangeBit- 00:07:30.529 [2024-11-02 12:07:17.343742] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.529 [2024-11-02 12:07:17.343768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.529 [2024-11-02 12:07:17.343830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffdfffffff 00:07:30.529 [2024-11-02 12:07:17.343844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.529 #44 NEW cov: 11840 ft: 14452 corp: 31/4079b lim: 320 exec/s: 44 rss: 69Mb L: 152/243 MS: 1 ChangeByte- 00:07:30.529 [2024-11-02 12:07:17.383734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (30) qid:0 cid:4 nsid:4e4e4e4e cdw10:4e4e4e4e cdw11:4e4e4e4e 00:07:30.529 [2024-11-02 12:07:17.383759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.529 #45 NEW cov: 11840 ft: 14532 corp: 32/4188b lim: 320 exec/s: 45 rss: 69Mb L: 109/243 MS: 1 InsertRepeatedBytes- 00:07:30.529 [2024-11-02 12:07:17.423899] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.529 [2024-11-02 12:07:17.423925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.529 #46 NEW cov: 11840 ft: 14548 corp: 33/4305b lim: 320 exec/s: 46 rss: 69Mb L: 117/243 MS: 1 ChangeASCIIInt- 00:07:30.529 [2024-11-02 12:07:17.464321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:46ffffff cdw10:ffff4646 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:30.529 [2024-11-02 12:07:17.464347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.529 [2024-11-02 
12:07:17.464408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffdfff 00:07:30.529 [2024-11-02 12:07:17.464421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.529 [2024-11-02 12:07:17.464482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff 00:07:30.529 [2024-11-02 12:07:17.464496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.529 [2024-11-02 12:07:17.464547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff 00:07:30.529 [2024-11-02 12:07:17.464560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.789 #47 NEW cov: 11840 ft: 14787 corp: 34/4562b lim: 320 exec/s: 47 rss: 70Mb L: 257/257 MS: 1 CopyPart- 00:07:30.789 [2024-11-02 12:07:17.504207] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.789 [2024-11-02 12:07:17.504234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.789 [2024-11-02 12:07:17.504295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.789 [2024-11-02 12:07:17.504310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.789 #48 NEW cov: 11840 ft: 14793 corp: 35/4722b lim: 320 exec/s: 48 rss: 70Mb L: 160/257 MS: 1 ChangeBinInt- 00:07:30.789 [2024-11-02 12:07:17.544313] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.789 [2024-11-02 12:07:17.544338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.789 [2024-11-02 12:07:17.544399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.789 [2024-11-02 12:07:17.544413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.789 #49 NEW cov: 11840 ft: 14806 corp: 36/4882b lim: 320 exec/s: 49 rss: 70Mb L: 160/257 MS: 1 ChangeBit- 00:07:30.789 [2024-11-02 12:07:17.584417] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.789 [2024-11-02 12:07:17.584442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.789 [2024-11-02 12:07:17.584504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.789 
[2024-11-02 12:07:17.584521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.789 #50 NEW cov: 11840 ft: 14808 corp: 37/5023b lim: 320 exec/s: 50 rss: 70Mb L: 141/257 MS: 1 EraseBytes- 00:07:30.789 [2024-11-02 12:07:17.624425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.789 [2024-11-02 12:07:17.624451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.789 #51 NEW cov: 11840 ft: 14822 corp: 38/5139b lim: 320 exec/s: 51 rss: 70Mb L: 116/257 MS: 1 ChangeASCIIInt- 00:07:30.789 [2024-11-02 12:07:17.654782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:46ffffff cdw10:ffff4646 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x4646464646464646 00:07:30.789 [2024-11-02 12:07:17.654808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.789 [2024-11-02 12:07:17.654887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000002 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffdfff 00:07:30.789 [2024-11-02 12:07:17.654902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.789 [2024-11-02 12:07:17.654963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffdfffffff 00:07:30.789 [2024-11-02 12:07:17.654977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.789 [2024-11-02 12:07:17.655032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:ffffff00 00:07:30.789 [2024-11-02 12:07:17.655046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.789 #52 NEW cov: 11840 ft: 14883 corp: 39/5404b lim: 320 exec/s: 52 rss: 70Mb L: 265/265 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:07:30.789 [2024-11-02 12:07:17.694854] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.789 [2024-11-02 12:07:17.694881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.789 [2024-11-02 12:07:17.694960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.789 [2024-11-02 12:07:17.694974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.789 [2024-11-02 12:07:17.695036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.789 [2024-11-02 12:07:17.695050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:30.789 [2024-11-02 12:07:17.734975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:30.789 [2024-11-02 12:07:17.735007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.789 [2024-11-02 12:07:17.735067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffff 00:07:30.789 [2024-11-02 12:07:17.735081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.789 [2024-11-02 12:07:17.735136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:46464646 cdw11:46464646 00:07:30.789 [2024-11-02 12:07:17.735149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.789 #55 NEW cov: 11840 ft: 14932 corp: 40/5598b lim: 320 exec/s: 55 rss: 70Mb L: 194/265 MS: 3 CopyPart-CrossOver-CrossOver- 00:07:31.049 [2024-11-02 12:07:17.774958] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.049 [2024-11-02 12:07:17.774984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.049 [2024-11-02 12:07:17.775053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.049 [2024-11-02 12:07:17.775067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.049 #56 NEW cov: 11840 ft: 14941 corp: 41/5766b lim: 320 exec/s: 56 rss: 70Mb L: 168/265 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\004"- 00:07:31.049 [2024-11-02 12:07:17.815030] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.049 [2024-11-02 12:07:17.815057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.049 #57 NEW cov: 11840 ft: 14967 corp: 42/5882b lim: 320 exec/s: 57 rss: 70Mb L: 116/265 MS: 1 ShuffleBytes- 00:07:31.049 [2024-11-02 12:07:17.855202] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.049 [2024-11-02 12:07:17.855228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.049 [2024-11-02 12:07:17.855290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffff80ffffff 00:07:31.049 [2024-11-02 12:07:17.855304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.049 #58 NEW cov: 11840 ft: 14970 corp: 43/6059b lim: 320 exec/s: 58 rss: 70Mb L: 177/265 MS: 1 CopyPart- 00:07:31.049 [2024-11-02 12:07:17.895371] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.049 [2024-11-02 12:07:17.895397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.049 [2024-11-02 12:07:17.895460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffdfffffff 00:07:31.049 [2024-11-02 12:07:17.895474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.049 #59 NEW cov: 11840 ft: 14977 corp: 44/6211b lim: 320 exec/s: 59 rss: 70Mb L: 152/265 MS: 1 ChangeASCIIInt- 00:07:31.049 [2024-11-02 12:07:17.935360] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.050 [2024-11-02 12:07:17.935385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.050 #60 NEW cov: 11840 ft: 14989 corp: 45/6327b lim: 320 exec/s: 60 rss: 70Mb L: 116/265 MS: 1 ChangeBit- 00:07:31.050 [2024-11-02 12:07:17.975561] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.050 [2024-11-02 12:07:17.975590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.050 [2024-11-02 12:07:17.975652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.050 [2024-11-02 12:07:17.975666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.050 #61 NEW cov: 11840 ft: 15010 corp: 46/6487b lim: 320 exec/s: 30 rss: 70Mb L: 160/265 MS: 1 ShuffleBytes- 00:07:31.050 #61 DONE cov: 11840 ft: 15010 corp: 46/6487b lim: 320 exec/s: 30 rss: 70Mb 00:07:31.050 ###### Recommended dictionary. ###### 00:07:31.050 "\000\000\000\000\000\000\000\004" # Uses: 2 00:07:31.050 "\002\000\000\000\000\000\000\000" # Uses: 2 00:07:31.050 ###### End of recommended dictionary. 
###### 00:07:31.050 Done 61 runs in 2 second(s) 00:07:31.310 12:07:18 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:31.310 12:07:18 -- ../common.sh@72 -- # (( i++ )) 00:07:31.310 12:07:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:31.310 12:07:18 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:31.310 12:07:18 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:31.310 12:07:18 -- nvmf/run.sh@24 -- # local timen=1 00:07:31.310 12:07:18 -- nvmf/run.sh@25 -- # local core=0x1 00:07:31.310 12:07:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:31.310 12:07:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:31.310 12:07:18 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:31.310 12:07:18 -- nvmf/run.sh@29 -- # port=4401 00:07:31.310 12:07:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:31.310 12:07:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:31.310 12:07:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:31.310 12:07:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:31.310 [2024-11-02 12:07:18.148414] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:31.310 [2024-11-02 12:07:18.148484] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1144223 ] 00:07:31.310 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.597 [2024-11-02 12:07:18.397606] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.597 [2024-11-02 12:07:18.424679] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:31.597 [2024-11-02 12:07:18.424793] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.597 [2024-11-02 12:07:18.476019] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:31.597 [2024-11-02 12:07:18.492394] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:31.597 INFO: Running with entropic power schedule (0xFF, 100). 00:07:31.597 INFO: Seed: 3807619460 00:07:31.597 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:31.597 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:31.597 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:31.597 INFO: A corpus is not provided, starting from an empty corpus 00:07:31.597 #2 INITED exec/s: 0 rss: 59Mb 00:07:31.597 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
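The records in the stream above and below are libFuzzer's standard status lines. In "#61 DONE cov: 11840 ft: 15010 corp: 46/6487b lim: 320 exec/s: 30 rss: 70Mb", the leading number counts executions so far, "cov" is the cumulative coverage tallied by the inline 8-bit counters loaded at startup, "ft" counts distinct coverage features, "corp" gives corpus entries and total bytes, "lim" is the current input-size cap, "exec/s" is throughput, and "rss" is resident memory; on "NEW" lines, "L:" reports the new input's length against the largest seen and "MS:" names the mutation sequence that produced it. The "Recommended dictionary" block prints escaped byte strings in essentially the syntax a libFuzzer -dict= file accepts, so they can be carried over to seed later runs. As a rough illustration only (this parser is not part of the SPDK test scripts; the regex and function names are invented here, and it assumes the console output has been saved to a file), the coverage growth recorded in a log like this one can be pulled out with a few lines of Python:

  import re
  import sys

  # Matches status lines as printed in this log, e.g.
  #   #61 DONE cov: 11840 ft: 15010 corp: 46/6487b lim: 320 exec/s: 30 rss: 70Mb
  STATUS = re.compile(
      r"#(?P<execs>\d+) (?P<event>NEW|DONE) "
      r"cov: (?P<cov>\d+) ft: (?P<ft>\d+) "
      r"corp: (?P<entries>\d+)/(?P<bytes>\d+)b lim: (?P<lim>\d+) "
      r"exec/s: (?P<rate>\d+) rss: (?P<rss>\d+)Mb")

  def coverage_events(text):
      # Yield (executions, event, cov, ft) in the order they appear in the log.
      for m in STATUS.finditer(text):
          yield (int(m.group("execs")), m.group("event"),
                 int(m.group("cov")), int(m.group("ft")))

  if __name__ == "__main__":
      for execs, event, cov, ft in coverage_events(open(sys.argv[1]).read()):
          print(f"{event:4s} @ exec {execs:6d}: cov={cov} ft={ft}")

Run over the output of the run that just completed, a sketch like this would trace coverage climbing from the 11804 reported at #22 to the 11840 on the #61 DONE line, which is how plateaus in these short one-second fuzz passes can be spotted across nightly builds.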
00:07:31.597 This may also happen if the target rejected all inputs we tried so far 00:07:31.597 [2024-11-02 12:07:18.537396] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:31.597 [2024-11-02 12:07:18.537518] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:31.597 [2024-11-02 12:07:18.537623] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:31.597 [2024-11-02 12:07:18.537823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.597 [2024-11-02 12:07:18.537853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.597 [2024-11-02 12:07:18.537907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.597 [2024-11-02 12:07:18.537922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.597 [2024-11-02 12:07:18.537972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.597 [2024-11-02 12:07:18.537986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.880 NEW_FUNC[1/671]: 0x451d18 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:31.880 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:31.880 #4 NEW cov: 11620 ft: 11621 corp: 2/24b lim: 30 exec/s: 0 rss: 67Mb L: 23/23 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:32.139 [2024-11-02 12:07:18.858981] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.139 [2024-11-02 12:07:18.859169] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.139 [2024-11-02 12:07:18.859549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.139 [2024-11-02 12:07:18.859601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.139 [2024-11-02 12:07:18.859733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.139 [2024-11-02 12:07:18.859758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.139 #5 NEW cov: 11739 ft: 12665 corp: 3/37b lim: 30 exec/s: 0 rss: 67Mb L: 13/23 MS: 1 EraseBytes- 00:07:32.139 [2024-11-02 12:07:18.909099] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (796944) > buf size (4096) 00:07:32.139 [2024-11-02 12:07:18.909534] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.139 [2024-11-02 12:07:18.909883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 
cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.139 [2024-11-02 12:07:18.909913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.139 [2024-11-02 12:07:18.910042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.139 [2024-11-02 12:07:18.910060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.139 [2024-11-02 12:07:18.910182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.139 [2024-11-02 12:07:18.910200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.139 [2024-11-02 12:07:18.910325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.139 [2024-11-02 12:07:18.910347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.139 #6 NEW cov: 11785 ft: 13386 corp: 4/64b lim: 30 exec/s: 0 rss: 67Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:07:32.139 [2024-11-02 12:07:18.959032] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.139 [2024-11-02 12:07:18.959235] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.139 [2024-11-02 12:07:18.959581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.139 [2024-11-02 12:07:18.959613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.139 [2024-11-02 12:07:18.959748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff8343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.139 [2024-11-02 12:07:18.959767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.139 #7 NEW cov: 11870 ft: 13716 corp: 5/81b lim: 30 exec/s: 0 rss: 67Mb L: 17/27 MS: 1 InsertRepeatedBytes- 00:07:32.139 [2024-11-02 12:07:18.998871] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4343 00:07:32.139 [2024-11-02 12:07:18.999056] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (68612) > buf size (4096) 00:07:32.139 [2024-11-02 12:07:18.999363] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.139 [2024-11-02 12:07:18.999739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a430000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.139 [2024-11-02 12:07:18.999771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.139 [2024-11-02 12:07:18.999902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.139 [2024-11-02 12:07:18.999921] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.139 [2024-11-02 12:07:19.000052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.140 [2024-11-02 12:07:19.000071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.140 [2024-11-02 12:07:19.000196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.140 [2024-11-02 12:07:19.000215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.140 #8 NEW cov: 11870 ft: 13843 corp: 6/108b lim: 30 exec/s: 0 rss: 67Mb L: 27/27 MS: 1 ShuffleBytes- 00:07:32.140 [2024-11-02 12:07:19.049511] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x43 00:07:32.140 [2024-11-02 12:07:19.049691] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (68880) > buf size (4096) 00:07:32.140 [2024-11-02 12:07:19.049986] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.140 [2024-11-02 12:07:19.050332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a430043 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.140 [2024-11-02 12:07:19.050361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.140 [2024-11-02 12:07:19.050472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43430000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.140 [2024-11-02 12:07:19.050491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.140 [2024-11-02 12:07:19.050622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.140 [2024-11-02 12:07:19.050641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.140 [2024-11-02 12:07:19.050765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00008343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.140 [2024-11-02 12:07:19.050784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.140 #9 NEW cov: 11870 ft: 13980 corp: 7/136b lim: 30 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 CrossOver- 00:07:32.140 [2024-11-02 12:07:19.099658] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (796944) > buf size (4096) 00:07:32.140 [2024-11-02 12:07:19.100477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.140 [2024-11-02 12:07:19.100508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.140 [2024-11-02 12:07:19.100631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET 
LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.140 [2024-11-02 12:07:19.100652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.140 [2024-11-02 12:07:19.100781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.140 [2024-11-02 12:07:19.100801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.140 [2024-11-02 12:07:19.100939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.140 [2024-11-02 12:07:19.100957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.399 #10 NEW cov: 11870 ft: 14047 corp: 8/163b lim: 30 exec/s: 0 rss: 67Mb L: 27/28 MS: 1 CopyPart- 00:07:32.399 [2024-11-02 12:07:19.149913] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.399 [2024-11-02 12:07:19.150098] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.399 [2024-11-02 12:07:19.150259] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.399 [2024-11-02 12:07:19.150408] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.399 [2024-11-02 12:07:19.150558] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:32.399 [2024-11-02 12:07:19.150905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.399 [2024-11-02 12:07:19.150935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.399 [2024-11-02 12:07:19.151055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.399 [2024-11-02 12:07:19.151075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.399 [2024-11-02 12:07:19.151205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.399 [2024-11-02 12:07:19.151225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.399 [2024-11-02 12:07:19.151346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.399 [2024-11-02 12:07:19.151367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.399 [2024-11-02 12:07:19.151493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.399 [2024-11-02 12:07:19.151510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:32.399 #11 NEW cov: 11870 ft: 14177 corp: 9/193b lim: 30 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:32.399 [2024-11-02 12:07:19.199840] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.399 [2024-11-02 12:07:19.200031] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.399 [2024-11-02 12:07:19.200364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.399 [2024-11-02 12:07:19.200396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.399 [2024-11-02 12:07:19.200522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:430d8300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.399 [2024-11-02 12:07:19.200542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.399 #12 NEW cov: 11870 ft: 14226 corp: 10/206b lim: 30 exec/s: 0 rss: 67Mb L: 13/30 MS: 1 ChangeBinInt- 00:07:32.399 [2024-11-02 12:07:19.250004] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.400 [2024-11-02 12:07:19.250204] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.400 [2024-11-02 12:07:19.250589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c0a8343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.400 [2024-11-02 12:07:19.250620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.400 [2024-11-02 12:07:19.250741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.400 [2024-11-02 12:07:19.250760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.400 #18 NEW cov: 11870 ft: 14260 corp: 11/220b lim: 30 exec/s: 0 rss: 67Mb L: 14/30 MS: 1 InsertByte- 00:07:32.400 [2024-11-02 12:07:19.289934] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (796944) > buf size (4096) 00:07:32.400 [2024-11-02 12:07:19.290720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.400 [2024-11-02 12:07:19.290751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.400 [2024-11-02 12:07:19.290881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.400 [2024-11-02 12:07:19.290900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.400 [2024-11-02 12:07:19.291032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00100000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.400 [2024-11-02 12:07:19.291050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.400 [2024-11-02 12:07:19.291169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.400 [2024-11-02 12:07:19.291189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.400 #19 NEW cov: 11870 ft: 14283 corp: 12/247b lim: 30 exec/s: 0 rss: 67Mb L: 27/30 MS: 1 ChangeBit- 00:07:32.400 [2024-11-02 12:07:19.350319] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.400 [2024-11-02 12:07:19.350491] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.400 [2024-11-02 12:07:19.350817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c0a8343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.400 [2024-11-02 12:07:19.350846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.400 [2024-11-02 12:07:19.350973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.400 [2024-11-02 12:07:19.350991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.659 #20 NEW cov: 11870 ft: 14374 corp: 13/261b lim: 30 exec/s: 0 rss: 67Mb L: 14/30 MS: 1 ChangeByte- 00:07:32.659 [2024-11-02 12:07:19.400575] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.659 [2024-11-02 12:07:19.400748] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.659 [2024-11-02 12:07:19.400894] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4343 00:07:32.659 [2024-11-02 12:07:19.401268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.659 [2024-11-02 12:07:19.401298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.659 [2024-11-02 12:07:19.401424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.659 [2024-11-02 12:07:19.401442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.659 [2024-11-02 12:07:19.401570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:43430017 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.659 [2024-11-02 12:07:19.401591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.659 #21 NEW cov: 11870 ft: 14427 corp: 14/284b lim: 30 exec/s: 0 rss: 67Mb L: 23/30 MS: 1 ChangeBinInt- 00:07:32.659 [2024-11-02 12:07:19.450542] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000043 00:07:32.659 [2024-11-02 12:07:19.450931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:32.659 [2024-11-02 12:07:19.450963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.659 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:32.659 #22 NEW cov: 11893 ft: 14877 corp: 15/293b lim: 30 exec/s: 0 rss: 68Mb L: 9/30 MS: 1 EraseBytes- 00:07:32.659 [2024-11-02 12:07:19.510776] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.659 [2024-11-02 12:07:19.510962] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.659 [2024-11-02 12:07:19.511328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.659 [2024-11-02 12:07:19.511360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.659 [2024-11-02 12:07:19.511484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:4343833b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.659 [2024-11-02 12:07:19.511503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.659 #23 NEW cov: 11893 ft: 14935 corp: 16/306b lim: 30 exec/s: 23 rss: 68Mb L: 13/30 MS: 1 ChangeBinInt- 00:07:32.659 [2024-11-02 12:07:19.560955] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.659 [2024-11-02 12:07:19.561156] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.659 [2024-11-02 12:07:19.561473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.659 [2024-11-02 12:07:19.561504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.659 [2024-11-02 12:07:19.561636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:3b438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.659 [2024-11-02 12:07:19.561653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.659 #24 NEW cov: 11893 ft: 15017 corp: 17/319b lim: 30 exec/s: 24 rss: 68Mb L: 13/30 MS: 1 CrossOver- 00:07:32.659 [2024-11-02 12:07:19.611175] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000043 00:07:32.659 [2024-11-02 12:07:19.611355] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1f 00:07:32.659 [2024-11-02 12:07:19.611684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.659 [2024-11-02 12:07:19.611724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.659 [2024-11-02 12:07:19.611850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43430001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.659 [2024-11-02 12:07:19.611870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.918 #25 NEW cov: 11893 ft: 15024 corp: 18/332b lim: 30 exec/s: 25 rss: 68Mb L: 13/30 MS: 1 CMP- DE: "\001\000\000\037"- 00:07:32.918 [2024-11-02 12:07:19.671489] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.918 [2024-11-02 12:07:19.671672] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.918 [2024-11-02 12:07:19.671839] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.918 [2024-11-02 12:07:19.672021] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.918 [2024-11-02 12:07:19.672390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.918 [2024-11-02 12:07:19.672422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.918 [2024-11-02 12:07:19.672552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.918 [2024-11-02 12:07:19.672573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.918 [2024-11-02 12:07:19.672705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.918 [2024-11-02 12:07:19.672724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.918 [2024-11-02 12:07:19.672855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.918 [2024-11-02 12:07:19.672879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.918 #26 NEW cov: 11893 ft: 15057 corp: 19/360b lim: 30 exec/s: 26 rss: 68Mb L: 28/30 MS: 1 EraseBytes- 00:07:32.918 [2024-11-02 12:07:19.731553] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004381 00:07:32.918 [2024-11-02 12:07:19.731740] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.918 [2024-11-02 12:07:19.732099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.918 [2024-11-02 12:07:19.732130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.919 [2024-11-02 12:07:19.732261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:4343833b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.919 [2024-11-02 12:07:19.732279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.919 #27 NEW cov: 11893 ft: 15142 corp: 20/373b lim: 30 exec/s: 27 rss: 68Mb L: 13/30 MS: 1 ChangeByte- 00:07:32.919 [2024-11-02 12:07:19.791918] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.919 [2024-11-02 
12:07:19.792115] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1fff 00:07:32.919 [2024-11-02 12:07:19.792270] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.919 [2024-11-02 12:07:19.792427] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.919 [2024-11-02 12:07:19.792588] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:32.919 [2024-11-02 12:07:19.792957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.919 [2024-11-02 12:07:19.792988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.919 [2024-11-02 12:07:19.793139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.919 [2024-11-02 12:07:19.793158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.919 [2024-11-02 12:07:19.793299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.919 [2024-11-02 12:07:19.793316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.919 [2024-11-02 12:07:19.793440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.919 [2024-11-02 12:07:19.793460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:32.919 [2024-11-02 12:07:19.793590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.919 [2024-11-02 12:07:19.793609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:32.919 #28 NEW cov: 11893 ft: 15165 corp: 21/403b lim: 30 exec/s: 28 rss: 68Mb L: 30/30 MS: 1 PersAutoDict- DE: "\001\000\000\037"- 00:07:32.919 [2024-11-02 12:07:19.831747] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.919 [2024-11-02 12:07:19.831928] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.919 [2024-11-02 12:07:19.832322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.919 [2024-11-02 12:07:19.832358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.919 [2024-11-02 12:07:19.832486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.919 [2024-11-02 12:07:19.832505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.919 #29 NEW cov: 11893 ft: 15181 corp: 22/416b lim: 30 exec/s: 29 rss: 68Mb L: 13/30 MS: 1 
CopyPart- 00:07:32.919 [2024-11-02 12:07:19.882019] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.919 [2024-11-02 12:07:19.882199] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:32.919 [2024-11-02 12:07:19.882345] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:32.919 [2024-11-02 12:07:19.882666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a4383ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.919 [2024-11-02 12:07:19.882697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.919 [2024-11-02 12:07:19.882819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.919 [2024-11-02 12:07:19.882836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.919 [2024-11-02 12:07:19.882976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:43438300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.919 [2024-11-02 12:07:19.882998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.178 #30 NEW cov: 11893 ft: 15221 corp: 23/435b lim: 30 exec/s: 30 rss: 68Mb L: 19/30 MS: 1 InsertRepeatedBytes- 00:07:33.178 [2024-11-02 12:07:19.921712] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.178 [2024-11-02 12:07:19.921900] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.178 [2024-11-02 12:07:19.922056] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (68880) > buf size (4096) 00:07:33.178 [2024-11-02 12:07:19.922435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:19.922464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:19.922598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:19.922618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:19.922740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:43430017 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:19.922760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.178 #31 NEW cov: 11893 ft: 15259 corp: 24/458b lim: 30 exec/s: 31 rss: 68Mb L: 23/30 MS: 1 PersAutoDict- DE: "\001\000\000\037"- 00:07:33.178 [2024-11-02 12:07:19.962025] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:33.178 [2024-11-02 12:07:19.962186] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1fff 00:07:33.178 [2024-11-02 12:07:19.962350] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:33.178 [2024-11-02 12:07:19.962507] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:33.178 [2024-11-02 12:07:19.962648] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:33.178 [2024-11-02 12:07:19.963011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:19.963041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:19.963169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff040000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:19.963186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:19.963302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:19.963321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:19.963440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:19.963458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:19.963586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:19.963604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.178 #32 NEW cov: 11893 ft: 15275 corp: 25/488b lim: 30 exec/s: 32 rss: 68Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:33.178 [2024-11-02 12:07:20.023172] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.178 [2024-11-02 12:07:20.023385] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.178 [2024-11-02 12:07:20.023768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:70438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:20.023801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:20.023935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:433b8343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:20.023953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.178 #33 NEW cov: 11893 ft: 15369 corp: 26/502b lim: 30 exec/s: 33 rss: 68Mb L: 14/30 MS: 1 InsertByte- 00:07:33.178 [2024-11-02 12:07:20.082871] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4343 00:07:33.178 [2024-11-02 
12:07:20.083050] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (68612) > buf size (4096) 00:07:33.178 [2024-11-02 12:07:20.083362] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.178 [2024-11-02 12:07:20.083714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a430000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:20.083747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:20.083862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:20.083881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:20.084013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:20.084035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:20.084153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:20.084171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.178 #34 NEW cov: 11893 ft: 15405 corp: 27/530b lim: 30 exec/s: 34 rss: 68Mb L: 28/30 MS: 1 CrossOver- 00:07:33.178 [2024-11-02 12:07:20.122882] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x43 00:07:33.178 [2024-11-02 12:07:20.123065] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (68880) > buf size (4096) 00:07:33.178 [2024-11-02 12:07:20.123397] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004342 00:07:33.178 [2024-11-02 12:07:20.123760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a430043 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:20.123792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:20.123915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43430000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:20.123937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:20.124068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 12:07:20.124086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.178 [2024-11-02 12:07:20.124185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00008343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.178 [2024-11-02 
12:07:20.124203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.178 #35 NEW cov: 11893 ft: 15417 corp: 28/558b lim: 30 exec/s: 35 rss: 68Mb L: 28/30 MS: 1 ChangeBit- 00:07:33.438 [2024-11-02 12:07:20.183160] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.438 [2024-11-02 12:07:20.183325] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.438 [2024-11-02 12:07:20.183483] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f3f3 00:07:33.438 [2024-11-02 12:07:20.183635] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f3f3 00:07:33.438 [2024-11-02 12:07:20.183786] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f3f3 00:07:33.438 [2024-11-02 12:07:20.184118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:70438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.184159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.184290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:433b8343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.184307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.184404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:432883f3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.184420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.184552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:f3f383f3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.184575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.184700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:f3f383f3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.184721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.438 #36 NEW cov: 11893 ft: 15445 corp: 29/588b lim: 30 exec/s: 36 rss: 68Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:33.438 [2024-11-02 12:07:20.243309] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.438 [2024-11-02 12:07:20.243489] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.438 [2024-11-02 12:07:20.243639] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (68880) > buf size (4096) 00:07:33.438 [2024-11-02 12:07:20.243793] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000fe1f 00:07:33.438 [2024-11-02 12:07:20.244188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.244220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.244341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.244360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.244492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:43430017 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.244510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.244629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00fe02fe cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.244648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.438 #37 NEW cov: 11893 ft: 15458 corp: 30/615b lim: 30 exec/s: 37 rss: 69Mb L: 27/30 MS: 1 InsertRepeatedBytes- 00:07:33.438 [2024-11-02 12:07:20.303615] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:33.438 [2024-11-02 12:07:20.303796] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:33.438 [2024-11-02 12:07:20.303958] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:33.438 [2024-11-02 12:07:20.304116] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:33.438 [2024-11-02 12:07:20.304278] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:33.438 [2024-11-02 12:07:20.304642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.304673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.304803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.304823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.304925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.304947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.305007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.305018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 
12:07:20.305034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.305044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.438 #38 NEW cov: 11902 ft: 15534 corp: 31/645b lim: 30 exec/s: 38 rss: 69Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:33.438 [2024-11-02 12:07:20.353521] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:33.438 [2024-11-02 12:07:20.353680] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:33.438 [2024-11-02 12:07:20.353831] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.438 [2024-11-02 12:07:20.354181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a4383ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.354211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.354342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.354363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.438 [2024-11-02 12:07:20.354496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:43438300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.438 [2024-11-02 12:07:20.354515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.438 #39 NEW cov: 11902 ft: 15545 corp: 32/665b lim: 30 exec/s: 39 rss: 69Mb L: 20/30 MS: 1 InsertByte- 00:07:33.698 [2024-11-02 12:07:20.413590] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (68880) > buf size (4096) 00:07:33.698 [2024-11-02 12:07:20.413767] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.698 [2024-11-02 12:07:20.414128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:43430043 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.698 [2024-11-02 12:07:20.414159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.698 [2024-11-02 12:07:20.414284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:50438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.698 [2024-11-02 12:07:20.414302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.698 #40 NEW cov: 11902 ft: 15555 corp: 33/682b lim: 30 exec/s: 40 rss: 69Mb L: 17/30 MS: 1 InsertRepeatedBytes- 00:07:33.698 [2024-11-02 12:07:20.463759] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004329 00:07:33.698 [2024-11-02 12:07:20.463942] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.698 [2024-11-02 12:07:20.464298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) 
qid:0 cid:4 nsid:0 cdw10:2c0a8343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.698 [2024-11-02 12:07:20.464329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.698 [2024-11-02 12:07:20.464443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.698 [2024-11-02 12:07:20.464463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.698 #41 NEW cov: 11902 ft: 15565 corp: 34/696b lim: 30 exec/s: 41 rss: 69Mb L: 14/30 MS: 1 ChangeByte- 00:07:33.698 [2024-11-02 12:07:20.524098] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (831532) > buf size (4096) 00:07:33.698 [2024-11-02 12:07:20.524280] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (331024) > buf size (4096) 00:07:33.698 [2024-11-02 12:07:20.524438] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.698 [2024-11-02 12:07:20.524599] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004343 00:07:33.698 [2024-11-02 12:07:20.524954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c0a8343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.698 [2024-11-02 12:07:20.524987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.698 [2024-11-02 12:07:20.525122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:43438143 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.698 [2024-11-02 12:07:20.525140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.698 [2024-11-02 12:07:20.525263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1f28830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.698 [2024-11-02 12:07:20.525283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.698 [2024-11-02 12:07:20.525412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:43438343 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.698 [2024-11-02 12:07:20.525432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.698 #42 NEW cov: 11902 ft: 15579 corp: 35/723b lim: 30 exec/s: 21 rss: 69Mb L: 27/30 MS: 1 CrossOver- 00:07:33.698 #42 DONE cov: 11902 ft: 15579 corp: 35/723b lim: 30 exec/s: 21 rss: 69Mb 00:07:33.698 ###### Recommended dictionary. ###### 00:07:33.698 "\001\000\000\037" # Uses: 2 00:07:33.698 ###### End of recommended dictionary. 
###### 00:07:33.698 Done 42 runs in 2 second(s) 00:07:33.698 12:07:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:33.698 12:07:20 -- ../common.sh@72 -- # (( i++ )) 00:07:33.698 12:07:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:33.698 12:07:20 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:33.698 12:07:20 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:33.698 12:07:20 -- nvmf/run.sh@24 -- # local timen=1 00:07:33.698 12:07:20 -- nvmf/run.sh@25 -- # local core=0x1 00:07:33.698 12:07:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:33.698 12:07:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:33.698 12:07:20 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:33.698 12:07:20 -- nvmf/run.sh@29 -- # port=4402 00:07:33.698 12:07:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:33.957 12:07:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:33.957 12:07:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:33.957 12:07:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:33.957 [2024-11-02 12:07:20.705974] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:33.957 [2024-11-02 12:07:20.706048] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1144611 ] 00:07:33.957 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.216 [2024-11-02 12:07:20.957128] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.216 [2024-11-02 12:07:20.984497] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:34.216 [2024-11-02 12:07:20.984634] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.216 [2024-11-02 12:07:21.035900] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:34.216 [2024-11-02 12:07:21.052276] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:34.216 INFO: Running with entropic power schedule (0xFF, 100). 00:07:34.216 INFO: Seed: 2071668260 00:07:34.216 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:34.216 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:34.216 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:34.216 INFO: A corpus is not provided, starting from an empty corpus 00:07:34.217 #2 INITED exec/s: 0 rss: 59Mb 00:07:34.217 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:34.217 This may also happen if the target rejected all inputs we tried so far 00:07:34.217 [2024-11-02 12:07:21.128940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.217 [2024-11-02 12:07:21.128983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.476 NEW_FUNC[1/669]: 0x454738 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:34.476 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:34.476 #3 NEW cov: 11582 ft: 11583 corp: 2/10b lim: 35 exec/s: 0 rss: 66Mb L: 9/9 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\007"- 00:07:34.476 [2024-11-02 12:07:21.439665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.476 [2024-11-02 12:07:21.439723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.476 [2024-11-02 12:07:21.439883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.476 [2024-11-02 12:07:21.439910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.476 [2024-11-02 12:07:21.440070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.476 [2024-11-02 12:07:21.440099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.736 NEW_FUNC[1/1]: 0x1723ef8 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1456 00:07:34.736 #4 NEW cov: 11697 ft: 12469 corp: 3/33b lim: 35 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:07:34.736 [2024-11-02 12:07:21.498899] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.736 [2024-11-02 12:07:21.499282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.499322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.736 #15 NEW cov: 11712 ft: 12720 corp: 4/46b lim: 35 exec/s: 0 rss: 67Mb L: 13/23 MS: 1 InsertRepeatedBytes- 00:07:34.736 [2024-11-02 12:07:21.538907] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.736 [2024-11-02 12:07:21.539438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.539472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.736 [2024-11-02 12:07:21.539595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.539620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.736 [2024-11-02 12:07:21.539754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.539772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.736 #16 NEW cov: 11797 ft: 13018 corp: 5/69b lim: 35 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 CrossOver- 00:07:34.736 [2024-11-02 12:07:21.579143] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.736 [2024-11-02 12:07:21.579657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.579687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.736 [2024-11-02 12:07:21.579817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:07000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.579841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.736 [2024-11-02 12:07:21.579967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.579987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.736 #17 NEW cov: 11797 ft: 13113 corp: 6/92b lim: 35 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\007"- 00:07:34.736 [2024-11-02 12:07:21.629638] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.736 [2024-11-02 12:07:21.629805] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.736 [2024-11-02 12:07:21.629975] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.736 [2024-11-02 12:07:21.630334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.630364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.736 [2024-11-02 12:07:21.630494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.630522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.736 [2024-11-02 12:07:21.630654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.630680] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.736 [2024-11-02 12:07:21.630809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.630836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.736 #18 NEW cov: 11797 ft: 13746 corp: 7/125b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 CopyPart- 00:07:34.736 [2024-11-02 12:07:21.680186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.680217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.736 [2024-11-02 12:07:21.680343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.680363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.736 [2024-11-02 12:07:21.680483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.736 [2024-11-02 12:07:21.680500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.736 #19 NEW cov: 11797 ft: 13984 corp: 8/148b lim: 35 exec/s: 0 rss: 67Mb L: 23/33 MS: 1 ChangeBinInt- 00:07:34.995 [2024-11-02 12:07:21.729999] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.996 [2024-11-02 12:07:21.730658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.730688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.996 [2024-11-02 12:07:21.730819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.730839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.996 [2024-11-02 12:07:21.730962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.730984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.996 [2024-11-02 12:07:21.731072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff0a00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.731091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.996 #20 NEW cov: 11797 ft: 14125 corp: 9/180b lim: 35 exec/s: 0 rss: 67Mb L: 32/33 MS: 1 EraseBytes- 00:07:34.996 [2024-11-02 
12:07:21.790046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.790074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.996 #21 NEW cov: 11797 ft: 14172 corp: 10/190b lim: 35 exec/s: 0 rss: 67Mb L: 10/33 MS: 1 InsertByte- 00:07:34.996 [2024-11-02 12:07:21.840335] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.996 [2024-11-02 12:07:21.840989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.841020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.996 [2024-11-02 12:07:21.841131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.841161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.996 [2024-11-02 12:07:21.841284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:08000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.841301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.996 [2024-11-02 12:07:21.841417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff0a00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.841433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.996 #22 NEW cov: 11797 ft: 14217 corp: 11/222b lim: 35 exec/s: 0 rss: 67Mb L: 32/33 MS: 1 ChangeBit- 00:07:34.996 [2024-11-02 12:07:21.900098] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.996 [2024-11-02 12:07:21.900470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.900505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.996 #23 NEW cov: 11797 ft: 14258 corp: 12/229b lim: 35 exec/s: 0 rss: 67Mb L: 7/33 MS: 1 EraseBytes- 00:07:34.996 [2024-11-02 12:07:21.950186] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.996 [2024-11-02 12:07:21.950365] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.996 [2024-11-02 12:07:21.950531] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:34.996 [2024-11-02 12:07:21.950889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.950920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:34.996 [2024-11-02 12:07:21.951044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.951065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.996 [2024-11-02 12:07:21.951189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:00000e00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.951210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.996 [2024-11-02 12:07:21.951327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.996 [2024-11-02 12:07:21.951349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.255 #24 NEW cov: 11797 ft: 14336 corp: 13/262b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 CMP- DE: "\016\000\000\000\000\000\000\000"- 00:07:35.255 [2024-11-02 12:07:22.001265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.255 [2024-11-02 12:07:22.001295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.255 [2024-11-02 12:07:22.001412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.255 [2024-11-02 12:07:22.001430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.255 [2024-11-02 12:07:22.001558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.255 [2024-11-02 12:07:22.001576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.255 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:35.256 #25 NEW cov: 11820 ft: 14422 corp: 14/285b lim: 35 exec/s: 0 rss: 68Mb L: 23/33 MS: 1 CMP- DE: "\000\000\000\000\000\000\004\000"- 00:07:35.256 [2024-11-02 12:07:22.061049] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.256 [2024-11-02 12:07:22.061719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.256 [2024-11-02 12:07:22.061749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.256 [2024-11-02 12:07:22.061881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.256 [2024-11-02 12:07:22.061905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:35.256 [2024-11-02 12:07:22.062038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:08000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.256 [2024-11-02 12:07:22.062057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.256 [2024-11-02 12:07:22.062173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff0a00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.256 [2024-11-02 12:07:22.062190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.256 #26 NEW cov: 11820 ft: 14443 corp: 15/318b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 InsertByte- 00:07:35.256 [2024-11-02 12:07:22.120792] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.256 [2024-11-02 12:07:22.121167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000e0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.256 [2024-11-02 12:07:22.121205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.256 #32 NEW cov: 11820 ft: 14458 corp: 16/331b lim: 35 exec/s: 32 rss: 68Mb L: 13/33 MS: 1 PersAutoDict- DE: "\016\000\000\000\000\000\000\000"- 00:07:35.256 [2024-11-02 12:07:22.171258] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.256 [2024-11-02 12:07:22.171635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.256 [2024-11-02 12:07:22.171668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.256 [2024-11-02 12:07:22.171790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.256 [2024-11-02 12:07:22.171813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.256 #33 NEW cov: 11820 ft: 14654 corp: 17/346b lim: 35 exec/s: 33 rss: 68Mb L: 15/33 MS: 1 EraseBytes- 00:07:35.256 [2024-11-02 12:07:22.221546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.256 [2024-11-02 12:07:22.221577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.516 #34 NEW cov: 11820 ft: 14679 corp: 18/355b lim: 35 exec/s: 34 rss: 68Mb L: 9/33 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\004\000"- 00:07:35.516 [2024-11-02 12:07:22.271686] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.516 [2024-11-02 12:07:22.272371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.272403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.516 [2024-11-02 
12:07:22.272532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00040000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.272558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.516 [2024-11-02 12:07:22.272684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:08000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.272705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.516 [2024-11-02 12:07:22.272834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff0a00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.272855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.516 #35 NEW cov: 11820 ft: 14723 corp: 19/387b lim: 35 exec/s: 35 rss: 68Mb L: 32/33 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\004\000"- 00:07:35.516 [2024-11-02 12:07:22.322490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.322518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.516 [2024-11-02 12:07:22.322649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.322666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.516 [2024-11-02 12:07:22.322794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.322812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.516 [2024-11-02 12:07:22.322942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:0400ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.322960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.516 #36 NEW cov: 11820 ft: 14739 corp: 20/415b lim: 35 exec/s: 36 rss: 68Mb L: 28/33 MS: 1 InsertRepeatedBytes- 00:07:35.516 [2024-11-02 12:07:22.382022] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.516 [2024-11-02 12:07:22.382340] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.516 [2024-11-02 12:07:22.382690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.382720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.516 [2024-11-02 12:07:22.382850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.382876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.516 [2024-11-02 12:07:22.383014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.383036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.516 [2024-11-02 12:07:22.383150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000007 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.383170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.516 #37 NEW cov: 11820 ft: 14784 corp: 21/447b lim: 35 exec/s: 37 rss: 68Mb L: 32/33 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\007"- 00:07:35.516 [2024-11-02 12:07:22.422045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00003100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.422075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.516 #38 NEW cov: 11820 ft: 14814 corp: 22/457b lim: 35 exec/s: 38 rss: 68Mb L: 10/33 MS: 1 ShuffleBytes- 00:07:35.516 [2024-11-02 12:07:22.461685] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.516 [2024-11-02 12:07:22.462195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.462226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.516 [2024-11-02 12:07:22.462353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.462376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.516 [2024-11-02 12:07:22.462508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.516 [2024-11-02 12:07:22.462527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.516 #39 NEW cov: 11820 ft: 14900 corp: 23/481b lim: 35 exec/s: 39 rss: 68Mb L: 24/33 MS: 1 EraseBytes- 00:07:35.776 [2024-11-02 12:07:22.511944] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.776 [2024-11-02 12:07:22.512328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:5d000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 [2024-11-02 12:07:22.512363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.776 
#40 NEW cov: 11820 ft: 14926 corp: 24/494b lim: 35 exec/s: 40 rss: 68Mb L: 13/33 MS: 1 ChangeByte- 00:07:35.776 [2024-11-02 12:07:22.562500] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.776 [2024-11-02 12:07:22.562694] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.776 [2024-11-02 12:07:22.562865] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.776 [2024-11-02 12:07:22.563239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 [2024-11-02 12:07:22.563271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.776 [2024-11-02 12:07:22.563397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 [2024-11-02 12:07:22.563423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.776 [2024-11-02 12:07:22.563551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 [2024-11-02 12:07:22.563577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.776 [2024-11-02 12:07:22.563706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 [2024-11-02 12:07:22.563733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.776 #41 NEW cov: 11820 ft: 14989 corp: 25/527b lim: 35 exec/s: 41 rss: 68Mb L: 33/33 MS: 1 ChangeBinInt- 00:07:35.776 [2024-11-02 12:07:22.613017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 [2024-11-02 12:07:22.613043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.776 [2024-11-02 12:07:22.613184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffc4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 [2024-11-02 12:07:22.613205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.776 [2024-11-02 12:07:22.613341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 [2024-11-02 12:07:22.613360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.776 #42 NEW cov: 11820 ft: 15037 corp: 26/550b lim: 35 exec/s: 42 rss: 68Mb L: 23/33 MS: 1 ChangeByte- 00:07:35.776 [2024-11-02 12:07:22.662759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 
[2024-11-02 12:07:22.662790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.776 #43 NEW cov: 11820 ft: 15038 corp: 27/557b lim: 35 exec/s: 43 rss: 68Mb L: 7/33 MS: 1 CrossOver- 00:07:35.776 [2024-11-02 12:07:22.712860] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:35.776 [2024-11-02 12:07:22.713380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 [2024-11-02 12:07:22.713413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.776 [2024-11-02 12:07:22.713543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 [2024-11-02 12:07:22.713565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.776 [2024-11-02 12:07:22.713689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.776 [2024-11-02 12:07:22.713708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.776 #44 NEW cov: 11820 ft: 15050 corp: 28/580b lim: 35 exec/s: 44 rss: 68Mb L: 23/33 MS: 1 ChangeByte- 00:07:36.036 [2024-11-02 12:07:22.762739] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:36.036 [2024-11-02 12:07:22.763106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.036 [2024-11-02 12:07:22.763145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.036 #45 NEW cov: 11820 ft: 15054 corp: 29/587b lim: 35 exec/s: 45 rss: 68Mb L: 7/33 MS: 1 CrossOver- 00:07:36.036 [2024-11-02 12:07:22.813313] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:36.036 [2024-11-02 12:07:22.814012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.036 [2024-11-02 12:07:22.814043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.036 [2024-11-02 12:07:22.814165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.036 [2024-11-02 12:07:22.814186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.036 [2024-11-02 12:07:22.814314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:08000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.036 [2024-11-02 12:07:22.814335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.036 [2024-11-02 12:07:22.814455] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff0a00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.036 [2024-11-02 12:07:22.814473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.036 #46 NEW cov: 11820 ft: 15077 corp: 30/619b lim: 35 exec/s: 46 rss: 68Mb L: 32/33 MS: 1 CrossOver- 00:07:36.036 [2024-11-02 12:07:22.853270] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:36.036 [2024-11-02 12:07:22.853458] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:36.036 [2024-11-02 12:07:22.853833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.036 [2024-11-02 12:07:22.853862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.036 [2024-11-02 12:07:22.853982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.036 [2024-11-02 12:07:22.854008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.036 [2024-11-02 12:07:22.854129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.036 [2024-11-02 12:07:22.854159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.036 #47 NEW cov: 11820 ft: 15087 corp: 31/642b lim: 35 exec/s: 47 rss: 68Mb L: 23/33 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\004\000"- 00:07:36.036 [2024-11-02 12:07:22.892988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.036 [2024-11-02 12:07:22.893020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.036 #48 NEW cov: 11820 ft: 15116 corp: 32/649b lim: 35 exec/s: 48 rss: 68Mb L: 7/33 MS: 1 CrossOver- 00:07:36.036 [2024-11-02 12:07:22.943675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.036 [2024-11-02 12:07:22.943707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.036 #49 NEW cov: 11820 ft: 15129 corp: 33/659b lim: 35 exec/s: 49 rss: 68Mb L: 10/33 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\007"- 00:07:36.036 [2024-11-02 12:07:22.983211] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:36.036 [2024-11-02 12:07:22.983616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000e0000 cdw11:000000f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.036 [2024-11-02 12:07:22.983653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.036 #50 NEW cov: 11820 ft: 15131 corp: 34/672b lim: 35 exec/s: 50 
rss: 69Mb L: 13/33 MS: 1 ChangeBinInt- 00:07:36.296 [2024-11-02 12:07:23.023303] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:36.296 [2024-11-02 12:07:23.023651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.296 [2024-11-02 12:07:23.023682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.296 [2024-11-02 12:07:23.023807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.296 [2024-11-02 12:07:23.023829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.296 #51 NEW cov: 11820 ft: 15132 corp: 35/691b lim: 35 exec/s: 51 rss: 69Mb L: 19/33 MS: 1 EraseBytes- 00:07:36.296 [2024-11-02 12:07:23.074029] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:36.296 [2024-11-02 12:07:23.074563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.296 [2024-11-02 12:07:23.074593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.296 [2024-11-02 12:07:23.074716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.296 [2024-11-02 12:07:23.074742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.296 [2024-11-02 12:07:23.074864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff00002f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.296 [2024-11-02 12:07:23.074882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.296 #52 NEW cov: 11820 ft: 15198 corp: 36/715b lim: 35 exec/s: 26 rss: 69Mb L: 24/33 MS: 1 ChangeByte- 00:07:36.296 #52 DONE cov: 11820 ft: 15198 corp: 36/715b lim: 35 exec/s: 26 rss: 69Mb 00:07:36.297 ###### Recommended dictionary. ###### 00:07:36.297 "\000\000\000\000\000\000\000\007" # Uses: 3 00:07:36.297 "\016\000\000\000\000\000\000\000" # Uses: 1 00:07:36.297 "\000\000\000\000\000\000\004\000" # Uses: 3 00:07:36.297 ###### End of recommended dictionary. 
###### 00:07:36.297 Done 52 runs in 2 second(s) 00:07:36.297 12:07:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:36.297 12:07:23 -- ../common.sh@72 -- # (( i++ )) 00:07:36.297 12:07:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:36.297 12:07:23 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:36.297 12:07:23 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:36.297 12:07:23 -- nvmf/run.sh@24 -- # local timen=1 00:07:36.297 12:07:23 -- nvmf/run.sh@25 -- # local core=0x1 00:07:36.297 12:07:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:36.297 12:07:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:36.297 12:07:23 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:36.297 12:07:23 -- nvmf/run.sh@29 -- # port=4403 00:07:36.297 12:07:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:36.297 12:07:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:36.297 12:07:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:36.297 12:07:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:36.297 [2024-11-02 12:07:23.259349] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:36.297 [2024-11-02 12:07:23.259426] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1145060 ] 00:07:36.556 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.556 [2024-11-02 12:07:23.519715] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.815 [2024-11-02 12:07:23.547437] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:36.815 [2024-11-02 12:07:23.547558] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.815 [2024-11-02 12:07:23.599590] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:36.815 [2024-11-02 12:07:23.615964] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:36.815 INFO: Running with entropic power schedule (0xFF, 100). 00:07:36.815 INFO: Seed: 340697788 00:07:36.815 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:36.815 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:36.815 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:36.815 INFO: A corpus is not provided, starting from an empty corpus 00:07:36.815 #2 INITED exec/s: 0 rss: 59Mb 00:07:36.815 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:36.815 This may also happen if the target rejected all inputs we tried so far 00:07:37.075 NEW_FUNC[1/659]: 0x456418 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:37.075 NEW_FUNC[2/659]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:37.075 #5 NEW cov: 11480 ft: 11481 corp: 2/5b lim: 20 exec/s: 0 rss: 67Mb L: 4/4 MS: 3 CopyPart-CopyPart-InsertByte- 00:07:37.075 #6 NEW cov: 11607 ft: 12454 corp: 3/14b lim: 20 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 CMP- DE: "\234'\000\254\342\177\000\000"- 00:07:37.334 #7 NEW cov: 11617 ft: 13001 corp: 4/26b lim: 20 exec/s: 0 rss: 67Mb L: 12/12 MS: 1 PersAutoDict- DE: "\234'\000\254\342\177\000\000"- 00:07:37.334 #8 NEW cov: 11702 ft: 13318 corp: 5/36b lim: 20 exec/s: 0 rss: 67Mb L: 10/12 MS: 1 InsertByte- 00:07:37.334 #9 NEW cov: 11702 ft: 13457 corp: 6/40b lim: 20 exec/s: 0 rss: 67Mb L: 4/12 MS: 1 ChangeBit- 00:07:37.334 #10 NEW cov: 11702 ft: 13517 corp: 7/52b lim: 20 exec/s: 0 rss: 67Mb L: 12/12 MS: 1 ShuffleBytes- 00:07:37.334 NEW_FUNC[1/4]: 0x1137598 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:37.334 NEW_FUNC[2/4]: 0x1138118 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:07:37.334 #11 NEW cov: 11786 ft: 13666 corp: 8/66b lim: 20 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 InsertRepeatedBytes- 00:07:37.334 #12 NEW cov: 11786 ft: 13708 corp: 9/76b lim: 20 exec/s: 0 rss: 68Mb L: 10/14 MS: 1 CrossOver- 00:07:37.592 #13 NEW cov: 11786 ft: 13774 corp: 10/88b lim: 20 exec/s: 0 rss: 68Mb L: 12/14 MS: 1 PersAutoDict- DE: "\234'\000\254\342\177\000\000"- 00:07:37.592 #14 NEW cov: 11786 ft: 13802 corp: 11/102b lim: 20 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 InsertRepeatedBytes- 00:07:37.592 #15 NEW cov: 11786 ft: 13820 corp: 12/116b lim: 20 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 ChangeBinInt- 00:07:37.592 #16 NEW cov: 11786 ft: 13844 corp: 13/129b lim: 20 exec/s: 0 rss: 68Mb L: 13/14 MS: 1 InsertByte- 00:07:37.592 #17 NEW cov: 11786 ft: 13875 corp: 14/140b lim: 20 exec/s: 0 rss: 68Mb L: 11/14 MS: 1 CrossOver- 00:07:37.592 #18 NEW cov: 11786 ft: 13982 corp: 15/153b lim: 20 exec/s: 0 rss: 68Mb L: 13/14 MS: 1 ChangeByte- 00:07:37.592 [2024-11-02 12:07:24.534642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:37.592 [2024-11-02 12:07:24.534683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.851 NEW_FUNC[1/16]: 0x1550518 in nvme_ctrlr_process_async_event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3091 00:07:37.851 NEW_FUNC[2/16]: 0x15e68a8 in nvme_ctrlr_queue_async_event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3131 00:07:37.851 #19 NEW cov: 12026 ft: 14272 corp: 16/163b lim: 20 exec/s: 0 rss: 68Mb L: 10/14 MS: 1 InsertRepeatedBytes- 00:07:37.851 #20 NEW cov: 12026 ft: 14300 corp: 17/173b lim: 20 exec/s: 0 rss: 68Mb L: 10/14 MS: 1 ChangeBinInt- 00:07:37.851 #21 NEW cov: 12026 ft: 14323 corp: 18/187b lim: 20 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 ChangeBit- 00:07:37.851 #22 NEW cov: 12042 ft: 14535 corp: 19/207b lim: 20 exec/s: 22 rss: 68Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:37.851 #23 NEW cov: 12042 ft: 14597 corp: 20/221b lim: 20 exec/s: 23 rss: 68Mb 
L: 14/20 MS: 1 CopyPart- 00:07:37.851 #24 NEW cov: 12043 ft: 14619 corp: 21/239b lim: 20 exec/s: 24 rss: 68Mb L: 18/20 MS: 1 CrossOver- 00:07:37.851 #25 NEW cov: 12043 ft: 14642 corp: 22/251b lim: 20 exec/s: 25 rss: 68Mb L: 12/20 MS: 1 ChangeByte- 00:07:38.110 #26 NEW cov: 12043 ft: 14663 corp: 23/261b lim: 20 exec/s: 26 rss: 68Mb L: 10/20 MS: 1 InsertRepeatedBytes- 00:07:38.110 #29 NEW cov: 12043 ft: 14672 corp: 24/280b lim: 20 exec/s: 29 rss: 68Mb L: 19/20 MS: 3 EraseBytes-CrossOver-InsertRepeatedBytes- 00:07:38.110 #30 NEW cov: 12043 ft: 14690 corp: 25/284b lim: 20 exec/s: 30 rss: 68Mb L: 4/20 MS: 1 ChangeByte- 00:07:38.110 #31 NEW cov: 12043 ft: 14697 corp: 26/298b lim: 20 exec/s: 31 rss: 68Mb L: 14/20 MS: 1 InsertByte- 00:07:38.110 #32 NEW cov: 12043 ft: 14707 corp: 27/302b lim: 20 exec/s: 32 rss: 68Mb L: 4/20 MS: 1 ChangeBinInt- 00:07:38.110 #33 NEW cov: 12043 ft: 14730 corp: 28/308b lim: 20 exec/s: 33 rss: 68Mb L: 6/20 MS: 1 EraseBytes- 00:07:38.369 #34 NEW cov: 12043 ft: 14756 corp: 29/326b lim: 20 exec/s: 34 rss: 68Mb L: 18/20 MS: 1 InsertRepeatedBytes- 00:07:38.370 #35 NEW cov: 12043 ft: 14813 corp: 30/333b lim: 20 exec/s: 35 rss: 69Mb L: 7/20 MS: 1 EraseBytes- 00:07:38.370 #36 NEW cov: 12043 ft: 14834 corp: 31/337b lim: 20 exec/s: 36 rss: 69Mb L: 4/20 MS: 1 ChangeByte- 00:07:38.370 #37 NEW cov: 12043 ft: 14857 corp: 32/357b lim: 20 exec/s: 37 rss: 69Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:38.370 #38 NEW cov: 12043 ft: 14862 corp: 33/377b lim: 20 exec/s: 38 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:38.370 #39 NEW cov: 12052 ft: 14914 corp: 34/389b lim: 20 exec/s: 39 rss: 69Mb L: 12/20 MS: 1 CMP- DE: "b\006\375\001\3449\177\000"- 00:07:38.629 #40 NEW cov: 12052 ft: 14955 corp: 35/397b lim: 20 exec/s: 40 rss: 69Mb L: 8/20 MS: 1 EraseBytes- 00:07:38.629 #41 NEW cov: 12052 ft: 14969 corp: 36/407b lim: 20 exec/s: 41 rss: 69Mb L: 10/20 MS: 1 ShuffleBytes- 00:07:38.629 #42 NEW cov: 12052 ft: 14990 corp: 37/419b lim: 20 exec/s: 42 rss: 69Mb L: 12/20 MS: 1 ChangeBinInt- 00:07:38.629 #43 NEW cov: 12052 ft: 14994 corp: 38/424b lim: 20 exec/s: 43 rss: 69Mb L: 5/20 MS: 1 CrossOver- 00:07:38.629 #44 NEW cov: 12052 ft: 15001 corp: 39/437b lim: 20 exec/s: 44 rss: 69Mb L: 13/20 MS: 1 CrossOver- 00:07:38.629 #45 NEW cov: 12052 ft: 15034 corp: 40/446b lim: 20 exec/s: 45 rss: 69Mb L: 9/20 MS: 1 EraseBytes- 00:07:38.888 #46 NEW cov: 12052 ft: 15046 corp: 41/451b lim: 20 exec/s: 46 rss: 69Mb L: 5/20 MS: 1 EraseBytes- 00:07:38.888 #47 NEW cov: 12052 ft: 15082 corp: 42/470b lim: 20 exec/s: 23 rss: 69Mb L: 19/20 MS: 1 CopyPart- 00:07:38.888 #47 DONE cov: 12052 ft: 15082 corp: 42/470b lim: 20 exec/s: 23 rss: 69Mb 00:07:38.888 ###### Recommended dictionary. ###### 00:07:38.888 "\234'\000\254\342\177\000\000" # Uses: 2 00:07:38.888 "b\006\375\001\3449\177\000" # Uses: 0 00:07:38.888 ###### End of recommended dictionary. 
###### 00:07:38.888 Done 47 runs in 2 second(s) 00:07:38.888 12:07:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:38.888 12:07:25 -- ../common.sh@72 -- # (( i++ )) 00:07:38.888 12:07:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:38.888 12:07:25 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:38.888 12:07:25 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:38.888 12:07:25 -- nvmf/run.sh@24 -- # local timen=1 00:07:38.888 12:07:25 -- nvmf/run.sh@25 -- # local core=0x1 00:07:38.888 12:07:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:38.888 12:07:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:38.888 12:07:25 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:38.888 12:07:25 -- nvmf/run.sh@29 -- # port=4404 00:07:38.888 12:07:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:38.888 12:07:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:38.888 12:07:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:38.888 12:07:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:38.888 [2024-11-02 12:07:25.831555] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:38.888 [2024-11-02 12:07:25.831640] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1145597 ] 00:07:38.888 EAL: No free 2048 kB hugepages reported on node 1 00:07:39.147 [2024-11-02 12:07:26.081800] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.147 [2024-11-02 12:07:26.109012] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:39.147 [2024-11-02 12:07:26.109155] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.406 [2024-11-02 12:07:26.160633] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:39.406 [2024-11-02 12:07:26.177011] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:39.406 INFO: Running with entropic power schedule (0xFF, 100). 00:07:39.406 INFO: Seed: 2901701137 00:07:39.406 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:39.406 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:39.406 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:39.406 INFO: A corpus is not provided, starting from an empty corpus 00:07:39.406 #2 INITED exec/s: 0 rss: 59Mb 00:07:39.406 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:39.406 This may also happen if the target rejected all inputs we tried so far 00:07:39.406 [2024-11-02 12:07:26.253696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2a00580a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.406 [2024-11-02 12:07:26.253737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.665 NEW_FUNC[1/671]: 0x457518 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:39.665 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:39.665 #51 NEW cov: 11605 ft: 11606 corp: 2/11b lim: 35 exec/s: 0 rss: 67Mb L: 10/10 MS: 4 ChangeByte-ChangeByte-CrossOver-CMP- DE: "*\000\000\000\000\000\000\000"- 00:07:39.665 [2024-11-02 12:07:26.563775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00005800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.665 [2024-11-02 12:07:26.563811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.665 #52 NEW cov: 11718 ft: 12300 corp: 3/19b lim: 35 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 EraseBytes- 00:07:39.665 [2024-11-02 12:07:26.613858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2a00580a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.665 [2024-11-02 12:07:26.613887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.665 #53 NEW cov: 11724 ft: 12693 corp: 4/29b lim: 35 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:07:39.924 [2024-11-02 12:07:26.654034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00005800 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.924 [2024-11-02 12:07:26.654063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.924 #54 NEW cov: 11809 ft: 12890 corp: 5/37b lim: 35 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 CrossOver- 00:07:39.924 [2024-11-02 12:07:26.704185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002a2a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.924 [2024-11-02 12:07:26.704213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.924 #57 NEW cov: 11809 ft: 12923 corp: 6/46b lim: 35 exec/s: 0 rss: 68Mb L: 9/10 MS: 3 ChangeBit-CopyPart-PersAutoDict- DE: "*\000\000\000\000\000\000\000"- 00:07:39.924 [2024-11-02 12:07:26.744606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002a2a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.924 [2024-11-02 12:07:26.744635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.924 [2024-11-02 12:07:26.744751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00002a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.924 [2024-11-02 
12:07:26.744768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.924 #58 NEW cov: 11809 ft: 13686 corp: 7/63b lim: 35 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 CopyPart- 00:07:39.924 [2024-11-02 12:07:26.794447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f00580a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.924 [2024-11-02 12:07:26.794474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.924 #59 NEW cov: 11809 ft: 13737 corp: 8/73b lim: 35 exec/s: 0 rss: 68Mb L: 10/17 MS: 1 ChangeByte- 00:07:39.924 [2024-11-02 12:07:26.834525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f00580a cdw11:00250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.924 [2024-11-02 12:07:26.834554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.924 #60 NEW cov: 11809 ft: 13782 corp: 9/84b lim: 35 exec/s: 0 rss: 68Mb L: 11/17 MS: 1 InsertByte- 00:07:39.925 [2024-11-02 12:07:26.884739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:58005800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.925 [2024-11-02 12:07:26.884766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.184 #61 NEW cov: 11809 ft: 13862 corp: 10/92b lim: 35 exec/s: 0 rss: 68Mb L: 8/17 MS: 1 ShuffleBytes- 00:07:40.184 [2024-11-02 12:07:26.934883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002a00 cdw11:2a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.184 [2024-11-02 12:07:26.934911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.184 #62 NEW cov: 11809 ft: 13938 corp: 11/101b lim: 35 exec/s: 0 rss: 68Mb L: 9/17 MS: 1 ShuffleBytes- 00:07:40.184 [2024-11-02 12:07:26.975016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002a00 cdw11:2a6a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.184 [2024-11-02 12:07:26.975047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.184 #63 NEW cov: 11809 ft: 14003 corp: 12/110b lim: 35 exec/s: 0 rss: 68Mb L: 9/17 MS: 1 ChangeByte- 00:07:40.184 [2024-11-02 12:07:27.025196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2a00580a cdw11:7e000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.184 [2024-11-02 12:07:27.025225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.184 #64 NEW cov: 11809 ft: 14030 corp: 13/121b lim: 35 exec/s: 0 rss: 68Mb L: 11/17 MS: 1 InsertByte- 00:07:40.184 [2024-11-02 12:07:27.075385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f00580a cdw11:00250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.184 [2024-11-02 12:07:27.075415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.184 #65 NEW cov: 11809 ft: 14047 corp: 14/132b 
lim: 35 exec/s: 0 rss: 69Mb L: 11/17 MS: 1 ChangeBit- 00:07:40.184 [2024-11-02 12:07:27.125524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2a77580a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.184 [2024-11-02 12:07:27.125551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.184 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:40.184 #66 NEW cov: 11832 ft: 14089 corp: 15/143b lim: 35 exec/s: 0 rss: 69Mb L: 11/17 MS: 1 InsertByte- 00:07:40.443 [2024-11-02 12:07:27.165553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000580a cdw11:25000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.443 [2024-11-02 12:07:27.165582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.443 #67 NEW cov: 11832 ft: 14116 corp: 16/153b lim: 35 exec/s: 0 rss: 69Mb L: 10/17 MS: 1 EraseBytes- 00:07:40.443 [2024-11-02 12:07:27.205732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2a00500a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.443 [2024-11-02 12:07:27.205758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.443 #68 NEW cov: 11832 ft: 14179 corp: 17/163b lim: 35 exec/s: 68 rss: 69Mb L: 10/17 MS: 1 ChangeBit- 00:07:40.443 [2024-11-02 12:07:27.246033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002a00 cdw11:2a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.443 [2024-11-02 12:07:27.246059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.443 [2024-11-02 12:07:27.246176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:002a0000 cdw11:002a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.443 [2024-11-02 12:07:27.246193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.443 #69 NEW cov: 11832 ft: 14225 corp: 18/180b lim: 35 exec/s: 69 rss: 69Mb L: 17/17 MS: 1 CrossOver- 00:07:40.443 [2024-11-02 12:07:27.296024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002a01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.443 [2024-11-02 12:07:27.296051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.443 #70 NEW cov: 11832 ft: 14248 corp: 19/189b lim: 35 exec/s: 70 rss: 69Mb L: 9/17 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:40.443 [2024-11-02 12:07:27.336057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.443 [2024-11-02 12:07:27.336089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.443 #74 NEW cov: 11832 ft: 14266 corp: 20/196b lim: 35 exec/s: 74 rss: 69Mb L: 7/17 MS: 4 ChangeByte-ChangeByte-ChangeBinInt-CrossOver- 00:07:40.443 [2024-11-02 12:07:27.376178] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f00581a cdw11:00250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.443 [2024-11-02 12:07:27.376206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.443 #75 NEW cov: 11832 ft: 14276 corp: 21/207b lim: 35 exec/s: 75 rss: 69Mb L: 11/17 MS: 1 ChangeBit- 00:07:40.443 [2024-11-02 12:07:27.416621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2a00580a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.443 [2024-11-02 12:07:27.416649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.443 [2024-11-02 12:07:27.416779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.443 [2024-11-02 12:07:27.416798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.703 #76 NEW cov: 11832 ft: 14298 corp: 22/225b lim: 35 exec/s: 76 rss: 69Mb L: 18/18 MS: 1 InsertRepeatedBytes- 00:07:40.703 [2024-11-02 12:07:27.456599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2a00580a cdw11:fbff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.703 [2024-11-02 12:07:27.456625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.703 #77 NEW cov: 11832 ft: 14313 corp: 23/235b lim: 35 exec/s: 77 rss: 69Mb L: 10/18 MS: 1 ChangeBinInt- 00:07:40.703 [2024-11-02 12:07:27.496621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2a00500a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.703 [2024-11-02 12:07:27.496665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.703 #78 NEW cov: 11832 ft: 14321 corp: 24/245b lim: 35 exec/s: 78 rss: 69Mb L: 10/18 MS: 1 ShuffleBytes- 00:07:40.703 [2024-11-02 12:07:27.536734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2a00580a cdw11:7e000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.703 [2024-11-02 12:07:27.536760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.703 #79 NEW cov: 11832 ft: 14360 corp: 25/256b lim: 35 exec/s: 79 rss: 69Mb L: 11/18 MS: 1 ChangeByte- 00:07:40.703 [2024-11-02 12:07:27.577870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f00581a cdw11:00250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.703 [2024-11-02 12:07:27.577898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.703 [2024-11-02 12:07:27.577965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.703 [2024-11-02 12:07:27.577983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.703 [2024-11-02 12:07:27.578110] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.703 [2024-11-02 12:07:27.578129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.703 [2024-11-02 12:07:27.578247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.703 [2024-11-02 12:07:27.578265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.703 [2024-11-02 12:07:27.578381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.703 [2024-11-02 12:07:27.578396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:40.703 #80 NEW cov: 11832 ft: 14745 corp: 26/291b lim: 35 exec/s: 80 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:40.703 [2024-11-02 12:07:27.626944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:001f581a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.703 [2024-11-02 12:07:27.626970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.703 #82 NEW cov: 11832 ft: 14820 corp: 27/298b lim: 35 exec/s: 82 rss: 69Mb L: 7/35 MS: 2 EraseBytes-InsertByte- 00:07:40.703 [2024-11-02 12:07:27.667349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000500a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.703 [2024-11-02 12:07:27.667377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.703 [2024-11-02 12:07:27.667491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:002a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.703 [2024-11-02 12:07:27.667507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.963 #83 NEW cov: 11832 ft: 14829 corp: 28/316b lim: 35 exec/s: 83 rss: 69Mb L: 18/35 MS: 1 InsertRepeatedBytes- 00:07:40.963 [2024-11-02 12:07:27.707237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002a00 cdw11:002a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.707264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.963 #87 NEW cov: 11832 ft: 14861 corp: 29/328b lim: 35 exec/s: 87 rss: 69Mb L: 12/35 MS: 4 EraseBytes-ChangeBit-CrossOver-CrossOver- 00:07:40.963 [2024-11-02 12:07:27.747371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:6c0f580a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.747397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.963 #88 NEW cov: 11832 ft: 14870 corp: 30/339b lim: 35 exec/s: 88 rss: 69Mb L: 11/35 MS: 1 InsertByte- 00:07:40.963 [2024-11-02 
12:07:27.788438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f00581a cdw11:00250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.788464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.963 [2024-11-02 12:07:27.788585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.788602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.963 [2024-11-02 12:07:27.788715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.788733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.963 [2024-11-02 12:07:27.788858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.788877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.963 [2024-11-02 12:07:27.789010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.789028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:40.963 #89 NEW cov: 11832 ft: 14883 corp: 31/374b lim: 35 exec/s: 89 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:40.963 [2024-11-02 12:07:27.837878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000500a cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.837905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.963 [2024-11-02 12:07:27.838045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:002a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.838061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.963 #90 NEW cov: 11832 ft: 14889 corp: 32/392b lim: 35 exec/s: 90 rss: 69Mb L: 18/35 MS: 1 ChangeByte- 00:07:40.963 [2024-11-02 12:07:27.878028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000500a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.878055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.963 [2024-11-02 12:07:27.878172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.878195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.963 #91 NEW 
cov: 11832 ft: 14895 corp: 33/410b lim: 35 exec/s: 91 rss: 69Mb L: 18/35 MS: 1 ShuffleBytes- 00:07:40.963 [2024-11-02 12:07:27.918014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffced5 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.918042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.963 [2024-11-02 12:07:27.918167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000d500 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.963 [2024-11-02 12:07:27.918184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.223 #92 NEW cov: 11832 ft: 14898 corp: 34/427b lim: 35 exec/s: 92 rss: 70Mb L: 17/35 MS: 1 ChangeBinInt- 00:07:41.223 [2024-11-02 12:07:27.958215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002a00 cdw11:002a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:27.958240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.223 [2024-11-02 12:07:27.958356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000010 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:27.958372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.223 [2024-11-02 12:07:27.998388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002a00 cdw11:002a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:27.998415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.223 [2024-11-02 12:07:27.998530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:10000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:27.998549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.223 #94 NEW cov: 11832 ft: 14909 corp: 35/443b lim: 35 exec/s: 94 rss: 70Mb L: 16/35 MS: 2 CMP-ShuffleBytes- DE: "\020\000\000\000"- 00:07:41.223 [2024-11-02 12:07:28.038445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000500a cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:28.038471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.223 [2024-11-02 12:07:28.038586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:002a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:28.038601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.223 #95 NEW cov: 11832 ft: 14915 corp: 36/461b lim: 35 exec/s: 95 rss: 70Mb L: 18/35 MS: 1 ChangeBit- 00:07:41.223 [2024-11-02 12:07:28.079390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 
nsid:0 cdw10:1f00581a cdw11:00250000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:28.079417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.223 [2024-11-02 12:07:28.079527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:28.079544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.223 [2024-11-02 12:07:28.079665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffde0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:28.079681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.223 [2024-11-02 12:07:28.079790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:28.079807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.223 [2024-11-02 12:07:28.079918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:28.079934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.223 #96 NEW cov: 11832 ft: 14928 corp: 37/496b lim: 35 exec/s: 96 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:07:41.223 [2024-11-02 12:07:28.118439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:29ff580a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:28.118466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.223 #97 NEW cov: 11832 ft: 14938 corp: 38/506b lim: 35 exec/s: 97 rss: 70Mb L: 10/35 MS: 1 ChangeBinInt- 00:07:41.223 [2024-11-02 12:07:28.159062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2a00580a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:28.159090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.223 [2024-11-02 12:07:28.159214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:28.159231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.223 [2024-11-02 12:07:28.159340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.223 [2024-11-02 12:07:28.159360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.223 #98 NEW cov: 11832 ft: 15147 corp: 39/530b lim: 35 exec/s: 98 rss: 70Mb L: 24/35 MS: 1 InsertRepeatedBytes- 
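This run drives fuzz_admin_create_io_completion_queue_command (see the NEW_FUNC line above), so every record carries a CREATE IO CQ (05) command whose dwords encode queue geometry. A minimal decode of one logged dword, assuming the standard NVMe Create I/O CQ field layout (a standalone illustrative snippet, not fuzzer code):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* "CREATE IO CQ (05) ... cdw10:2a00580a" from a record above:
         * CDW10 packs QSIZE (bits 31:16, zero-based) over QID (bits 15:0). */
        uint32_t cdw10 = 0x2a00580au;
        printf("qid=0x%04x qsize=0x%04x\n",
               (unsigned)(cdw10 & 0xffff),
               (unsigned)((cdw10 >> 16) & 0xffff));
        return 0;
    }

Queue IDs like 0x580a are far outside anything the target configured, and an NVMe-oF controller does not implement Create I/O CQ at all (fabrics queues are set up via Connect), which is consistent with every completion above reading INVALID OPCODE (00/01).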
00:07:41.483 [2024-11-02 12:07:28.198915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:58005800 cdw11:002a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.483 [2024-11-02 12:07:28.198943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.483 [2024-11-02 12:07:28.199062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.483 [2024-11-02 12:07:28.199080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.483 #99 NEW cov: 11832 ft: 15151 corp: 40/546b lim: 35 exec/s: 49 rss: 70Mb L: 16/35 MS: 1 PersAutoDict- DE: "*\000\000\000\000\000\000\000"- 00:07:41.483 #99 DONE cov: 11832 ft: 15151 corp: 40/546b lim: 35 exec/s: 49 rss: 70Mb 00:07:41.483 ###### Recommended dictionary. ###### 00:07:41.483 "*\000\000\000\000\000\000\000" # Uses: 2 00:07:41.483 "\001\000\000\000\000\000\000\000" # Uses: 0 00:07:41.483 "\020\000\000\000" # Uses: 0 00:07:41.483 ###### End of recommended dictionary. ###### 00:07:41.483 Done 99 runs in 2 second(s) 00:07:41.483 12:07:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:41.483 12:07:28 -- ../common.sh@72 -- # (( i++ )) 00:07:41.483 12:07:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:41.483 12:07:28 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:41.483 12:07:28 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:41.483 12:07:28 -- nvmf/run.sh@24 -- # local timen=1 00:07:41.483 12:07:28 -- nvmf/run.sh@25 -- # local core=0x1 00:07:41.483 12:07:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:41.483 12:07:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:41.483 12:07:28 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:41.483 12:07:28 -- nvmf/run.sh@29 -- # port=4405 00:07:41.483 12:07:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:41.483 12:07:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:41.483 12:07:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:41.483 12:07:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:41.483 [2024-11-02 12:07:28.384561] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
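The -F argument in the run.sh invocation above is an SPDK transport ID string; each numbered fuzzer instance gets its own TCP port (trsvcid 4405 for this run) via the sed rewrite of fuzz_json.conf. For reference, a minimal sketch of parsing that string with SPDK's public spdk_nvme_transport_id_parse() (the parse call and struct fields are the real public API from include/spdk/nvme.h; the standalone program around them is illustrative):

    #include <stdio.h>
    #include "spdk/nvme.h"

    int main(void)
    {
        struct spdk_nvme_transport_id trid = {0};
        /* Same key:value format as the -F argument logged above. */
        int rc = spdk_nvme_transport_id_parse(&trid,
            "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 "
            "traddr:127.0.0.1 trsvcid:4405");
        if (rc != 0) {
            fprintf(stderr, "unparsable transport ID\n");
            return 1;
        }
        printf("target at %s:%s\n", trid.traddr, trid.trsvcid);
        return 0;
    }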
00:07:41.483 [2024-11-02 12:07:28.384629] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1146012 ] 00:07:41.483 EAL: No free 2048 kB hugepages reported on node 1 00:07:41.741 [2024-11-02 12:07:28.632538] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.741 [2024-11-02 12:07:28.659851] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:41.741 [2024-11-02 12:07:28.659973] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.742 [2024-11-02 12:07:28.711303] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:42.000 [2024-11-02 12:07:28.727683] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:42.000 INFO: Running with entropic power schedule (0xFF, 100). 00:07:42.000 INFO: Seed: 1155888338 00:07:42.000 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:42.000 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:42.000 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:42.000 INFO: A corpus is not provided, starting from an empty corpus 00:07:42.000 #2 INITED exec/s: 0 rss: 59Mb 00:07:42.000 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:42.000 This may also happen if the target rejected all inputs we tried so far 00:07:42.000 [2024-11-02 12:07:28.798302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.000 [2024-11-02 12:07:28.798346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.260 NEW_FUNC[1/671]: 0x4596b8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:42.260 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:42.260 #7 NEW cov: 11616 ft: 11617 corp: 2/11b lim: 45 exec/s: 0 rss: 67Mb L: 10/10 MS: 5 ChangeBit-ChangeBit-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:42.260 [2024-11-02 12:07:29.118507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.260 [2024-11-02 12:07:29.118557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.260 #10 NEW cov: 11729 ft: 12364 corp: 3/28b lim: 45 exec/s: 0 rss: 67Mb L: 17/17 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:42.260 [2024-11-02 12:07:29.158510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4a00600a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.260 [2024-11-02 12:07:29.158540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.260 #14 NEW cov: 11735 ft: 12631 corp: 4/41b lim: 45 exec/s: 0 rss: 67Mb L: 13/17 MS: 4 CopyPart-ChangeBit-InsertByte-CrossOver- 00:07:42.260 
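In the status lines above, cov and ft are libFuzzer's coverage and feature counters, corp is corpus entries/total bytes, and MS lists the mutations that produced an input; tags such as PersAutoDict and CMP with a DE: payload mean a dictionary or compare-derived entry was reused, and each finished run prints its learned entries in a "Recommended dictionary" block. As a sketch, assuming stock libFuzzer conventions (the SPDK wrapper does not do this automatically; it would be a hand-run option), the entries from the first run above could be saved to a file and fed back to a later run with libFuzzer's -dict=FILE flag:

    # nvmf.dict - libFuzzer/AFL dictionary syntax: one quoted token per line,
    # non-printable bytes written as \xNN. Transcribed from the
    # "Recommended dictionary" block of the first run above.
    "\x00\x00\x00\x00\x00\x00\x00\x07"
    "\x0e\x00\x00\x00\x00\x00\x00\x00"
    "\x00\x00\x00\x00\x00\x00\x04\x00"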
[2024-11-02 12:07:29.198633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4a00600a cdw11:00200000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.260 [2024-11-02 12:07:29.198660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.260 #15 NEW cov: 11820 ft: 12899 corp: 5/54b lim: 45 exec/s: 0 rss: 67Mb L: 13/17 MS: 1 ChangeBit- 00:07:42.519 [2024-11-02 12:07:29.238786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4a00600a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.519 [2024-11-02 12:07:29.238814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.519 #31 NEW cov: 11820 ft: 13014 corp: 6/67b lim: 45 exec/s: 0 rss: 67Mb L: 13/17 MS: 1 ChangeBit- 00:07:42.519 [2024-11-02 12:07:29.278806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.519 [2024-11-02 12:07:29.278834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.519 #32 NEW cov: 11820 ft: 13069 corp: 7/77b lim: 45 exec/s: 0 rss: 67Mb L: 10/17 MS: 1 CrossOver- 00:07:42.519 [2024-11-02 12:07:29.319256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:24242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.519 [2024-11-02 12:07:29.319285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.519 [2024-11-02 12:07:29.319415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:24242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.519 [2024-11-02 12:07:29.319438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.519 #33 NEW cov: 11820 ft: 13873 corp: 8/96b lim: 45 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:42.519 [2024-11-02 12:07:29.359078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.519 [2024-11-02 12:07:29.359107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.519 #34 NEW cov: 11820 ft: 13961 corp: 9/106b lim: 45 exec/s: 0 rss: 67Mb L: 10/19 MS: 1 ChangeByte- 00:07:42.519 [2024-11-02 12:07:29.399510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:242424cc cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.519 [2024-11-02 12:07:29.399538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.519 [2024-11-02 12:07:29.399654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:24242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.519 [2024-11-02 12:07:29.399673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.519 #35 NEW cov: 11820 ft: 13993 corp: 10/126b 
lim: 45 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 InsertByte- 00:07:42.519 [2024-11-02 12:07:29.449410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4a00600a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.519 [2024-11-02 12:07:29.449438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.519 #36 NEW cov: 11820 ft: 14028 corp: 11/139b lim: 45 exec/s: 0 rss: 67Mb L: 13/20 MS: 1 CopyPart- 00:07:42.519 [2024-11-02 12:07:29.489742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff1eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.519 [2024-11-02 12:07:29.489769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.519 [2024-11-02 12:07:29.489880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff600006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.519 [2024-11-02 12:07:29.489899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.778 #40 NEW cov: 11820 ft: 14040 corp: 12/160b lim: 45 exec/s: 0 rss: 68Mb L: 21/21 MS: 4 EraseBytes-InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:42.778 [2024-11-02 12:07:29.529856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:242424cc cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.778 [2024-11-02 12:07:29.529884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.778 [2024-11-02 12:07:29.530007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:24242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.779 [2024-11-02 12:07:29.530024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.779 #46 NEW cov: 11820 ft: 14082 corp: 13/181b lim: 45 exec/s: 0 rss: 68Mb L: 21/21 MS: 1 InsertByte- 00:07:42.779 [2024-11-02 12:07:29.569694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:87000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.779 [2024-11-02 12:07:29.569721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.779 #47 NEW cov: 11820 ft: 14104 corp: 14/191b lim: 45 exec/s: 0 rss: 68Mb L: 10/21 MS: 1 ChangeByte- 00:07:42.779 [2024-11-02 12:07:29.610102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:24242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.779 [2024-11-02 12:07:29.610135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.779 [2024-11-02 12:07:29.610248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:27242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.779 [2024-11-02 12:07:29.610265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.779 [2024-11-02 12:07:29.650254] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:24242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.779 [2024-11-02 12:07:29.650282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.779 [2024-11-02 12:07:29.650395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:27242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.779 [2024-11-02 12:07:29.650413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.779 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:42.779 #49 NEW cov: 11843 ft: 14138 corp: 15/210b lim: 45 exec/s: 0 rss: 68Mb L: 19/21 MS: 2 ChangeBinInt-ShuffleBytes- 00:07:42.779 [2024-11-02 12:07:29.690409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.779 [2024-11-02 12:07:29.690438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.779 [2024-11-02 12:07:29.690553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5a5a005a cdw11:5a5a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.779 [2024-11-02 12:07:29.690569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.779 #50 NEW cov: 11843 ft: 14163 corp: 16/235b lim: 45 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:42.779 [2024-11-02 12:07:29.730284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4a00600a cdw11:00200001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.779 [2024-11-02 12:07:29.730311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.779 #51 NEW cov: 11843 ft: 14183 corp: 17/249b lim: 45 exec/s: 0 rss: 68Mb L: 14/25 MS: 1 InsertByte- 00:07:43.038 [2024-11-02 12:07:29.770340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.038 [2024-11-02 12:07:29.770368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.039 #52 NEW cov: 11843 ft: 14207 corp: 18/259b lim: 45 exec/s: 52 rss: 68Mb L: 10/25 MS: 1 ShuffleBytes- 00:07:43.039 [2024-11-02 12:07:29.810974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff1eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.039 [2024-11-02 12:07:29.811007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.039 [2024-11-02 12:07:29.811126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff600006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.039 [2024-11-02 12:07:29.811142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.039 [2024-11-02 12:07:29.811250] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.039 [2024-11-02 12:07:29.811271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.039 #53 NEW cov: 11843 ft: 14490 corp: 19/288b lim: 45 exec/s: 53 rss: 68Mb L: 29/29 MS: 1 CrossOver- 00:07:43.039 [2024-11-02 12:07:29.860630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0060000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.039 [2024-11-02 12:07:29.860659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.039 #54 NEW cov: 11843 ft: 14499 corp: 20/299b lim: 45 exec/s: 54 rss: 68Mb L: 11/29 MS: 1 InsertByte- 00:07:43.039 [2024-11-02 12:07:29.901069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a4a6060 cdw11:000a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.039 [2024-11-02 12:07:29.901114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.039 [2024-11-02 12:07:29.901234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.039 [2024-11-02 12:07:29.901252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.039 #55 NEW cov: 11843 ft: 14504 corp: 21/325b lim: 45 exec/s: 55 rss: 68Mb L: 26/29 MS: 1 CrossOver- 00:07:43.039 [2024-11-02 12:07:29.941162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a4a6060 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.039 [2024-11-02 12:07:29.941191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.039 [2024-11-02 12:07:29.941303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:0a4a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.039 [2024-11-02 12:07:29.941321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.039 #56 NEW cov: 11843 ft: 14534 corp: 22/350b lim: 45 exec/s: 56 rss: 68Mb L: 25/29 MS: 1 CopyPart- 00:07:43.039 [2024-11-02 12:07:29.991679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff1eff cdw11:ffff0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.039 [2024-11-02 12:07:29.991709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.039 [2024-11-02 12:07:29.991831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:97979797 cdw11:97ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.039 [2024-11-02 12:07:29.991849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.039 [2024-11-02 12:07:29.991972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:60c40002 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:43.039 [2024-11-02 12:07:29.991990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.039 #57 NEW cov: 11843 ft: 14557 corp: 23/379b lim: 45 exec/s: 57 rss: 68Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:07:43.298 [2024-11-02 12:07:30.041750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff1eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.041779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.299 [2024-11-02 12:07:30.041904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff600006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.041922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.299 [2024-11-02 12:07:30.042048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.042066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.299 #58 NEW cov: 11843 ft: 14634 corp: 24/408b lim: 45 exec/s: 58 rss: 68Mb L: 29/29 MS: 1 ChangeBit- 00:07:43.299 [2024-11-02 12:07:30.091931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff1eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.091962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.299 [2024-11-02 12:07:30.092098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff600006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.092117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.299 [2024-11-02 12:07:30.092237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0000f6f6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.092255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.299 #59 NEW cov: 11843 ft: 14710 corp: 25/437b lim: 45 exec/s: 59 rss: 68Mb L: 29/29 MS: 1 ChangeBinInt- 00:07:43.299 [2024-11-02 12:07:30.141774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a4a6060 cdw11:000a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.141804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.299 [2024-11-02 12:07:30.141942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.141961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.299 #60 NEW cov: 11843 ft: 14726 corp: 26/463b lim: 45 
exec/s: 60 rss: 69Mb L: 26/29 MS: 1 ChangeBit- 00:07:43.299 [2024-11-02 12:07:30.181585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:24000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.181614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.299 #61 NEW cov: 11843 ft: 14735 corp: 27/474b lim: 45 exec/s: 61 rss: 69Mb L: 11/29 MS: 1 CrossOver- 00:07:43.299 [2024-11-02 12:07:30.222251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.222279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.299 [2024-11-02 12:07:30.222402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:005a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.222419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.299 [2024-11-02 12:07:30.222539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:5a5a5a5a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.222555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.299 #62 NEW cov: 11843 ft: 14776 corp: 28/503b lim: 45 exec/s: 62 rss: 69Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:07:43.299 [2024-11-02 12:07:30.272204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a4a6060 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.272236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.299 [2024-11-02 12:07:30.272344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:0a4a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.299 [2024-11-02 12:07:30.272361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.558 #63 NEW cov: 11843 ft: 14779 corp: 29/528b lim: 45 exec/s: 63 rss: 69Mb L: 25/29 MS: 1 ChangeByte- 00:07:43.558 [2024-11-02 12:07:30.312092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.558 [2024-11-02 12:07:30.312121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.558 #64 NEW cov: 11843 ft: 14828 corp: 30/539b lim: 45 exec/s: 64 rss: 69Mb L: 11/29 MS: 1 ChangeBinInt- 00:07:43.558 [2024-11-02 12:07:30.352468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:24242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.558 [2024-11-02 12:07:30.352495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.558 [2024-11-02 12:07:30.352618] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:002424ff cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.558 [2024-11-02 12:07:30.352635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.558 #65 NEW cov: 11843 ft: 14843 corp: 31/560b lim: 45 exec/s: 65 rss: 69Mb L: 21/29 MS: 1 CMP- DE: "\377\000"- 00:07:43.558 [2024-11-02 12:07:30.392776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff1eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.558 [2024-11-02 12:07:30.392803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.558 [2024-11-02 12:07:30.392922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff600006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.558 [2024-11-02 12:07:30.392939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.558 [2024-11-02 12:07:30.393060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.558 [2024-11-02 12:07:30.393075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.558 #66 NEW cov: 11843 ft: 14844 corp: 32/589b lim: 45 exec/s: 66 rss: 69Mb L: 29/29 MS: 1 ChangeByte- 00:07:43.558 [2024-11-02 12:07:30.432633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:24242424 cdw11:24cc0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.558 [2024-11-02 12:07:30.432660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.558 [2024-11-02 12:07:30.432772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:24242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.558 [2024-11-02 12:07:30.432788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.558 #67 NEW cov: 11843 ft: 14871 corp: 33/610b lim: 45 exec/s: 67 rss: 69Mb L: 21/29 MS: 1 ShuffleBytes- 00:07:43.558 [2024-11-02 12:07:30.472538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4a00600a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.558 [2024-11-02 12:07:30.472566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.558 #68 NEW cov: 11843 ft: 14889 corp: 34/623b lim: 45 exec/s: 68 rss: 69Mb L: 13/29 MS: 1 ChangeByte- 00:07:43.558 [2024-11-02 12:07:30.512829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a4a6060 cdw11:000a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.558 [2024-11-02 12:07:30.512859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.558 [2024-11-02 12:07:30.512973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:43.558 [2024-11-02 12:07:30.512989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.818 #69 NEW cov: 11843 ft: 14899 corp: 35/649b lim: 45 exec/s: 69 rss: 69Mb L: 26/29 MS: 1 CMP- DE: "\001\000"- 00:07:43.818 [2024-11-02 12:07:30.563092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2d000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.818 [2024-11-02 12:07:30.563120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.818 [2024-11-02 12:07:30.563241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.818 [2024-11-02 12:07:30.563258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.818 #70 NEW cov: 11843 ft: 14903 corp: 36/667b lim: 45 exec/s: 70 rss: 69Mb L: 18/29 MS: 1 InsertByte- 00:07:43.818 [2024-11-02 12:07:30.603175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a4a6060 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.818 [2024-11-02 12:07:30.603203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.818 [2024-11-02 12:07:30.603318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:0a4a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.818 [2024-11-02 12:07:30.603337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.818 #71 NEW cov: 11843 ft: 14920 corp: 37/688b lim: 45 exec/s: 71 rss: 69Mb L: 21/29 MS: 1 EraseBytes- 00:07:43.818 [2024-11-02 12:07:30.653401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:24242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.818 [2024-11-02 12:07:30.653429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.818 [2024-11-02 12:07:30.653556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:24242424 cdw11:24240001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.818 [2024-11-02 12:07:30.653573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.818 #72 NEW cov: 11843 ft: 14924 corp: 38/707b lim: 45 exec/s: 72 rss: 69Mb L: 19/29 MS: 1 ChangeBinInt- 00:07:43.818 [2024-11-02 12:07:30.693697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff1eff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.818 [2024-11-02 12:07:30.693724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.818 [2024-11-02 12:07:30.693846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0097c44a cdw11:97ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.818 [2024-11-02 12:07:30.693862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.818 [2024-11-02 12:07:30.693985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:60c40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.818 [2024-11-02 12:07:30.694007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.818 #73 NEW cov: 11843 ft: 14930 corp: 39/736b lim: 45 exec/s: 73 rss: 70Mb L: 29/29 MS: 1 CopyPart- 00:07:43.818 [2024-11-02 12:07:30.743269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:02000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.818 [2024-11-02 12:07:30.743297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.818 #74 NEW cov: 11843 ft: 15025 corp: 40/747b lim: 45 exec/s: 37 rss: 70Mb L: 11/29 MS: 1 CMP- DE: "\000\000\002\000"- 00:07:43.818 #74 DONE cov: 11843 ft: 15025 corp: 40/747b lim: 45 exec/s: 37 rss: 70Mb 00:07:43.818 ###### Recommended dictionary. ###### 00:07:43.818 "\377\000" # Uses: 0 00:07:43.818 "\001\000" # Uses: 0 00:07:43.818 "\000\000\002\000" # Uses: 0 00:07:43.818 ###### End of recommended dictionary. ###### 00:07:43.818 Done 74 runs in 2 second(s) 00:07:44.078 12:07:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:44.078 12:07:30 -- ../common.sh@72 -- # (( i++ )) 00:07:44.078 12:07:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:44.078 12:07:30 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:44.078 12:07:30 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:44.078 12:07:30 -- nvmf/run.sh@24 -- # local timen=1 00:07:44.078 12:07:30 -- nvmf/run.sh@25 -- # local core=0x1 00:07:44.078 12:07:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:44.078 12:07:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:44.078 12:07:30 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:44.078 12:07:30 -- nvmf/run.sh@29 -- # port=4406 00:07:44.078 12:07:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:44.078 12:07:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:44.078 12:07:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:44.078 12:07:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:44.078 [2024-11-02 12:07:30.929159] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
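
Each fuzz iteration in these runs appears as a command/completion pair from nvme_qpair.c: the command line dumps the opcode, qid/cid, nsid, and command dwords 10-11 of the fuzzed submission entry, and the completion line decodes the 16-bit status field. "(00/01)" is status code type 0 (generic) with status code 01h, Invalid Opcode: the NVMe-oF target does not accept queue-management admin commands such as CREATE/DELETE IO SQ/CQ, so every fuzzed submission is rejected with that status. A simplified decode of the printed fields, assuming the completion layout from the NVMe specification (SPDK's spdk_nvme_cpl packs the same bits; the real CQE also carries sqid and cid):

#include <stdint.h>
#include <stdio.h>

/* Simplified NVMe completion-queue entry, per the NVMe spec. */
struct cpl_fields {
    uint32_t cdw0;    /* command-specific result ("cdw0:0" in the log)  */
    uint16_t sqhd;    /* submission-queue head pointer ("sqhd:000f")    */
    uint16_t status;  /* bit 0: phase tag, bits 1-8: SC, bits 9-11: SCT,
                         bit 14: more (m), bit 15: do-not-retry (dnr)   */
};

static void print_completion(const struct cpl_fields *cpl)
{
    unsigned sc  = (cpl->status >> 1) & 0xff;  /* 01h = Invalid Opcode  */
    unsigned sct = (cpl->status >> 9) & 0x07;  /* 0h  = generic status  */
    printf("(%02x/%02x) cdw0:%x sqhd:%04x p:%u m:%u dnr:%u\n",
           sct, sc, cpl->cdw0, cpl->sqhd,
           cpl->status & 1u,
           (cpl->status >> 14) & 1u,
           (cpl->status >> 15) & 1u);
}

int main(void)
{
    struct cpl_fields cpl = { .cdw0 = 0, .sqhd = 0x000f,
                              .status = (0x01 << 1) };  /* SCT 0 / SC 01h */
    print_completion(&cpl);  /* "(00/01) cdw0:0 sqhd:000f p:0 m:0 dnr:0" */
    return 0;
}
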
00:07:44.078 [2024-11-02 12:07:30.929245] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1146435 ] 00:07:44.078 EAL: No free 2048 kB hugepages reported on node 1 00:07:44.337 [2024-11-02 12:07:31.182353] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.337 [2024-11-02 12:07:31.208845] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:44.337 [2024-11-02 12:07:31.208979] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.337 [2024-11-02 12:07:31.260289] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:44.337 [2024-11-02 12:07:31.276650] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:44.337 INFO: Running with entropic power schedule (0xFF, 100). 00:07:44.337 INFO: Seed: 3704736719 00:07:44.596 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:44.596 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:44.596 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:44.596 INFO: A corpus is not provided, starting from an empty corpus 00:07:44.596 #2 INITED exec/s: 0 rss: 59Mb 00:07:44.596 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:44.596 This may also happen if the target rejected all inputs we tried so far 00:07:44.596 [2024-11-02 12:07:31.347052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:44.596 [2024-11-02 12:07:31.347091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.596 [2024-11-02 12:07:31.347157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004b4b cdw11:00000000 00:07:44.596 [2024-11-02 12:07:31.347173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.855 NEW_FUNC[1/669]: 0x45bec8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:44.855 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:44.855 #3 NEW cov: 11529 ft: 11530 corp: 2/5b lim: 10 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:44.855 [2024-11-02 12:07:31.667255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:44.855 [2024-11-02 12:07:31.667294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.855 [2024-11-02 12:07:31.667413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b1b4 cdw11:00000000 00:07:44.855 [2024-11-02 12:07:31.667430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.855 #4 NEW cov: 11646 ft: 12046 corp: 3/9b lim: 10 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ChangeBinInt- 00:07:44.855 
[2024-11-02 12:07:31.717264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:44.855 [2024-11-02 12:07:31.717293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.855 [2024-11-02 12:07:31.717414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b1b5 cdw11:00000000 00:07:44.855 [2024-11-02 12:07:31.717429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.855 #5 NEW cov: 11652 ft: 12360 corp: 4/13b lim: 10 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ChangeBit- 00:07:44.855 [2024-11-02 12:07:31.757179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:44.855 [2024-11-02 12:07:31.757205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.855 #6 NEW cov: 11737 ft: 12871 corp: 5/16b lim: 10 exec/s: 0 rss: 67Mb L: 3/4 MS: 1 EraseBytes- 00:07:44.855 [2024-11-02 12:07:31.797435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:44.855 [2024-11-02 12:07:31.797462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.855 [2024-11-02 12:07:31.797575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004b43 cdw11:00000000 00:07:44.855 [2024-11-02 12:07:31.797591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.855 #7 NEW cov: 11737 ft: 13031 corp: 6/20b lim: 10 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ChangeBinInt- 00:07:45.114 [2024-11-02 12:07:31.837592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.114 [2024-11-02 12:07:31.837619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.114 [2024-11-02 12:07:31.837735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004bb4 cdw11:00000000 00:07:45.114 [2024-11-02 12:07:31.837754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.114 #8 NEW cov: 11737 ft: 13078 corp: 7/24b lim: 10 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 CrossOver- 00:07:45.114 [2024-11-02 12:07:31.877496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.114 [2024-11-02 12:07:31.877524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.114 #9 NEW cov: 11737 ft: 13201 corp: 8/27b lim: 10 exec/s: 0 rss: 67Mb L: 3/4 MS: 1 EraseBytes- 00:07:45.114 [2024-11-02 12:07:31.917835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.114 [2024-11-02 12:07:31.917861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.114 [2024-11-02 
12:07:31.917971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004b4b cdw11:00000000 00:07:45.114 [2024-11-02 12:07:31.917987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.114 #10 NEW cov: 11737 ft: 13206 corp: 9/31b lim: 10 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 ShuffleBytes- 00:07:45.114 [2024-11-02 12:07:31.958209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000101 cdw11:00000000 00:07:45.114 [2024-11-02 12:07:31.958236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.114 [2024-11-02 12:07:31.958362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000010a cdw11:00000000 00:07:45.114 [2024-11-02 12:07:31.958378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.114 [2024-11-02 12:07:31.958495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004bb1 cdw11:00000000 00:07:45.114 [2024-11-02 12:07:31.958511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.114 #11 NEW cov: 11737 ft: 13491 corp: 10/37b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:45.114 [2024-11-02 12:07:31.998097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.114 [2024-11-02 12:07:31.998124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.114 [2024-11-02 12:07:31.998243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b1b4 cdw11:00000000 00:07:45.114 [2024-11-02 12:07:31.998260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.114 #12 NEW cov: 11737 ft: 13515 corp: 11/41b lim: 10 exec/s: 0 rss: 67Mb L: 4/6 MS: 1 ShuffleBytes- 00:07:45.114 [2024-11-02 12:07:32.038264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.114 [2024-11-02 12:07:32.038292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.114 [2024-11-02 12:07:32.038411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b1bb cdw11:00000000 00:07:45.114 [2024-11-02 12:07:32.038428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.114 #13 NEW cov: 11737 ft: 13569 corp: 12/45b lim: 10 exec/s: 0 rss: 67Mb L: 4/6 MS: 1 ChangeByte- 00:07:45.115 [2024-11-02 12:07:32.078374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001a4b cdw11:00000000 00:07:45.115 [2024-11-02 12:07:32.078400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.115 [2024-11-02 12:07:32.078516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 
cdw10:00004b43 cdw11:00000000 00:07:45.115 [2024-11-02 12:07:32.078533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.374 #14 NEW cov: 11737 ft: 13578 corp: 13/49b lim: 10 exec/s: 0 rss: 67Mb L: 4/6 MS: 1 ChangeBit- 00:07:45.374 [2024-11-02 12:07:32.118930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.118956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.374 [2024-11-02 12:07:32.119078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004b4b cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.119096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.374 [2024-11-02 12:07:32.119211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.119227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.374 [2024-11-02 12:07:32.119342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.119357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.374 #15 NEW cov: 11737 ft: 13794 corp: 14/57b lim: 10 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:45.374 [2024-11-02 12:07:32.158534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.158560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.374 [2024-11-02 12:07:32.158664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004bb0 cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.158682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.374 #16 NEW cov: 11737 ft: 13839 corp: 15/61b lim: 10 exec/s: 0 rss: 68Mb L: 4/8 MS: 1 ChangeBit- 00:07:45.374 [2024-11-02 12:07:32.198538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b1bb cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.198565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.374 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:45.374 #17 NEW cov: 11760 ft: 13858 corp: 16/63b lim: 10 exec/s: 0 rss: 68Mb L: 2/8 MS: 1 EraseBytes- 00:07:45.374 [2024-11-02 12:07:32.238785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008aff cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.238811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.374 [2024-11-02 12:07:32.238936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 
cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.238952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.374 #20 NEW cov: 11760 ft: 13872 corp: 17/67b lim: 10 exec/s: 0 rss: 68Mb L: 4/8 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:07:45.374 [2024-11-02 12:07:32.278931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.278957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.374 [2024-11-02 12:07:32.279075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000025b1 cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.279091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.374 #21 NEW cov: 11760 ft: 13941 corp: 18/72b lim: 10 exec/s: 0 rss: 68Mb L: 5/8 MS: 1 InsertByte- 00:07:45.374 [2024-11-02 12:07:32.319081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.319108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.374 [2024-11-02 12:07:32.319225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.319244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.374 #22 NEW cov: 11760 ft: 13960 corp: 19/76b lim: 10 exec/s: 22 rss: 68Mb L: 4/8 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:45.374 [2024-11-02 12:07:32.349199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.349226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.374 [2024-11-02 12:07:32.349335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000055b4 cdw11:00000000 00:07:45.374 [2024-11-02 12:07:32.349350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.633 #23 NEW cov: 11760 ft: 13979 corp: 20/80b lim: 10 exec/s: 23 rss: 68Mb L: 4/8 MS: 1 ChangeBinInt- 00:07:45.633 [2024-11-02 12:07:32.389105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f621 cdw11:00000000 00:07:45.633 [2024-11-02 12:07:32.389133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.633 #26 NEW cov: 11760 ft: 13986 corp: 21/82b lim: 10 exec/s: 26 rss: 68Mb L: 2/8 MS: 3 ShuffleBytes-ChangeByte-InsertByte- 00:07:45.633 [2024-11-02 12:07:32.429566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000101 cdw11:00000000 00:07:45.633 [2024-11-02 12:07:32.429593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.633 [2024-11-02 12:07:32.429715] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000010a cdw11:00000000 00:07:45.633 [2024-11-02 12:07:32.429732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.633 [2024-11-02 12:07:32.429851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004bea cdw11:00000000 00:07:45.633 [2024-11-02 12:07:32.429868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.633 #27 NEW cov: 11760 ft: 14010 corp: 22/88b lim: 10 exec/s: 27 rss: 68Mb L: 6/8 MS: 1 ChangeByte- 00:07:45.633 [2024-11-02 12:07:32.469225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.633 [2024-11-02 12:07:32.469253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.633 #28 NEW cov: 11760 ft: 14028 corp: 23/91b lim: 10 exec/s: 28 rss: 68Mb L: 3/8 MS: 1 EraseBytes- 00:07:45.633 [2024-11-02 12:07:32.509691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.633 [2024-11-02 12:07:32.509718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.633 [2024-11-02 12:07:32.509843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:45.633 [2024-11-02 12:07:32.509863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.633 #29 NEW cov: 11760 ft: 14060 corp: 24/95b lim: 10 exec/s: 29 rss: 68Mb L: 4/8 MS: 1 ShuffleBytes- 00:07:45.633 [2024-11-02 12:07:32.549826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a55 cdw11:00000000 00:07:45.633 [2024-11-02 12:07:32.549853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.633 [2024-11-02 12:07:32.549968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004bb4 cdw11:00000000 00:07:45.633 [2024-11-02 12:07:32.549984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.633 #30 NEW cov: 11760 ft: 14077 corp: 25/99b lim: 10 exec/s: 30 rss: 68Mb L: 4/8 MS: 1 ShuffleBytes- 00:07:45.633 [2024-11-02 12:07:32.590095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001a4b cdw11:00000000 00:07:45.633 [2024-11-02 12:07:32.590122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.633 [2024-11-02 12:07:32.590248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004b43 cdw11:00000000 00:07:45.633 [2024-11-02 12:07:32.590264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.633 [2024-11-02 12:07:32.590398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000102 cdw11:00000000 
00:07:45.633 [2024-11-02 12:07:32.590413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.892 #31 NEW cov: 11760 ft: 14093 corp: 26/105b lim: 10 exec/s: 31 rss: 68Mb L: 6/8 MS: 1 CMP- DE: "\001\002"- 00:07:45.892 [2024-11-02 12:07:32.640125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004b0a cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.640152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.892 [2024-11-02 12:07:32.640264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004bb4 cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.640282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.892 #32 NEW cov: 11760 ft: 14143 corp: 27/109b lim: 10 exec/s: 32 rss: 68Mb L: 4/8 MS: 1 ShuffleBytes- 00:07:45.892 [2024-11-02 12:07:32.680569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008aff cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.680597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.892 [2024-11-02 12:07:32.680724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.680740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.892 [2024-11-02 12:07:32.680846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.680863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.892 [2024-11-02 12:07:32.680978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.681000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.892 #33 NEW cov: 11760 ft: 14147 corp: 28/117b lim: 10 exec/s: 33 rss: 68Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:45.892 [2024-11-02 12:07:32.730170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.730199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.892 #34 NEW cov: 11760 ft: 14169 corp: 29/120b lim: 10 exec/s: 34 rss: 69Mb L: 3/8 MS: 1 ShuffleBytes- 00:07:45.892 [2024-11-02 12:07:32.780518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.780547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.892 [2024-11-02 12:07:32.780659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004ab1 cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.780675] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.892 #35 NEW cov: 11760 ft: 14206 corp: 30/125b lim: 10 exec/s: 35 rss: 69Mb L: 5/8 MS: 1 InsertByte- 00:07:45.892 [2024-11-02 12:07:32.820798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000102 cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.820826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.892 [2024-11-02 12:07:32.820950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.820967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.892 [2024-11-02 12:07:32.821083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004b43 cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.821103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.892 #36 NEW cov: 11760 ft: 14217 corp: 31/131b lim: 10 exec/s: 36 rss: 69Mb L: 6/8 MS: 1 PersAutoDict- DE: "\001\002"- 00:07:45.892 [2024-11-02 12:07:32.860970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.861003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.892 [2024-11-02 12:07:32.861131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000102 cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.861147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.892 [2024-11-02 12:07:32.861261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000025b1 cdw11:00000000 00:07:45.892 [2024-11-02 12:07:32.861278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.151 #37 NEW cov: 11760 ft: 14231 corp: 32/138b lim: 10 exec/s: 37 rss: 69Mb L: 7/8 MS: 1 PersAutoDict- DE: "\001\002"- 00:07:46.151 [2024-11-02 12:07:32.900650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b1bb cdw11:00000000 00:07:46.151 [2024-11-02 12:07:32.900676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.151 #38 NEW cov: 11760 ft: 14252 corp: 33/140b lim: 10 exec/s: 38 rss: 69Mb L: 2/8 MS: 1 ShuffleBytes- 00:07:46.151 [2024-11-02 12:07:32.951047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a28 cdw11:00000000 00:07:46.151 [2024-11-02 12:07:32.951074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.151 [2024-11-02 12:07:32.951194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004b4b cdw11:00000000 00:07:46.151 [2024-11-02 12:07:32.951214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
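
The #NEW lines are libFuzzer status output: cov/ft are coverage-edge and feature counts, corp is corpus size, the MS: field lists the mutation chain that produced the input (ChangeBit, CrossOver, PersAutoDict, ...), and DE: entries such as "\001\002" are dictionary entries harvested from comparisons. Underneath, the harness named in the trace (TestOneInput -> fuzz_admin_delete_io_completion_queue_command in llvm_nvme_fuzz.c) feeds each input to the target as an admin command. A hypothetical, heavily condensed sketch of that shape, not SPDK's actual implementation; submit_admin_cmd() is an assumed stand-in for SPDK's admin submission path:

#include <stdint.h>
#include <string.h>

/* Hypothetical harness sketch (NOT SPDK's code): fill a DELETE IO CQ
 * (opcode 04h) admin command's dwords from the fuzz input and submit
 * it; the command/completion prints in this log come from tracing
 * exactly such submissions. */
struct admin_cmd {
    uint8_t  opc;
    uint16_t cid;
    uint32_t nsid;
    uint32_t cdw10;   /* for DELETE IO CQ: the queue identifier */
    uint32_t cdw11;
};

void submit_admin_cmd(const struct admin_cmd *cmd);  /* assumed helper */

int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    struct admin_cmd cmd = { .opc = 0x04 };  /* DELETE IO CQ */

    if (size < 8)
        return 0;                    /* need 8 bytes for cdw10/cdw11 */
    memcpy(&cmd.cdw10, data, 4);     /* fuzz-controlled dwords */
    memcpy(&cmd.cdw11, data + 4, 4);
    submit_admin_cmd(&cmd);          /* target replies INVALID OPCODE, as
                                        queue management is disallowed
                                        over NVMe-oF */
    return 0;
}
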
00:07:46.151 #39 NEW cov: 11760 ft: 14264 corp: 34/145b lim: 10 exec/s: 39 rss: 69Mb L: 5/8 MS: 1 InsertByte- 00:07:46.151 [2024-11-02 12:07:32.990924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004301 cdw11:00000000 00:07:46.151 [2024-11-02 12:07:32.990950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.151 #40 NEW cov: 11760 ft: 14272 corp: 35/148b lim: 10 exec/s: 40 rss: 69Mb L: 3/8 MS: 1 EraseBytes- 00:07:46.151 [2024-11-02 12:07:33.031746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f600 cdw11:00000000 00:07:46.151 [2024-11-02 12:07:33.031773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.151 [2024-11-02 12:07:33.031890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.151 [2024-11-02 12:07:33.031908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.151 [2024-11-02 12:07:33.032023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.151 [2024-11-02 12:07:33.032040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.151 [2024-11-02 12:07:33.032148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.151 [2024-11-02 12:07:33.032168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.151 #42 NEW cov: 11760 ft: 14285 corp: 36/156b lim: 10 exec/s: 42 rss: 69Mb L: 8/8 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:46.151 [2024-11-02 12:07:33.081473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:46.151 [2024-11-02 12:07:33.081501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.151 [2024-11-02 12:07:33.081624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b1b4 cdw11:00000000 00:07:46.151 [2024-11-02 12:07:33.081643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.151 #43 NEW cov: 11760 ft: 14288 corp: 37/160b lim: 10 exec/s: 43 rss: 69Mb L: 4/8 MS: 1 CrossOver- 00:07:46.151 [2024-11-02 12:07:33.121301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a4b cdw11:00000000 00:07:46.151 [2024-11-02 12:07:33.121329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.410 #44 NEW cov: 11760 ft: 14295 corp: 38/163b lim: 10 exec/s: 44 rss: 69Mb L: 3/8 MS: 1 ShuffleBytes- 00:07:46.410 [2024-11-02 12:07:33.161710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000490a cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.161740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:46.410 [2024-11-02 12:07:33.161870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004bb4 cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.161886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.410 #45 NEW cov: 11760 ft: 14305 corp: 39/167b lim: 10 exec/s: 45 rss: 69Mb L: 4/8 MS: 1 ChangeBit- 00:07:46.410 [2024-11-02 12:07:33.201748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.201776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.410 [2024-11-02 12:07:33.201893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000fe00 cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.201909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.410 #46 NEW cov: 11760 ft: 14347 corp: 40/171b lim: 10 exec/s: 46 rss: 69Mb L: 4/8 MS: 1 ChangeBinInt- 00:07:46.410 [2024-11-02 12:07:33.242100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b1bb cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.242128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.410 [2024-11-02 12:07:33.242254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.242272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.410 [2024-11-02 12:07:33.242388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.242404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.410 #47 NEW cov: 11760 ft: 14366 corp: 41/177b lim: 10 exec/s: 47 rss: 69Mb L: 6/8 MS: 1 InsertRepeatedBytes- 00:07:46.410 [2024-11-02 12:07:33.281966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000490a cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.281992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.410 [2024-11-02 12:07:33.282115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004bb4 cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.282130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.410 #48 NEW cov: 11760 ft: 14373 corp: 42/181b lim: 10 exec/s: 48 rss: 69Mb L: 4/8 MS: 1 ShuffleBytes- 00:07:46.410 [2024-11-02 12:07:33.322346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000101 cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.322372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.410 [2024-11-02 12:07:33.322486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000010a cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.322501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.410 [2024-11-02 12:07:33.322625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000030b1 cdw11:00000000 00:07:46.410 [2024-11-02 12:07:33.322643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.410 #49 NEW cov: 11760 ft: 14382 corp: 43/187b lim: 10 exec/s: 24 rss: 69Mb L: 6/8 MS: 1 ChangeByte- 00:07:46.410 #49 DONE cov: 11760 ft: 14382 corp: 43/187b lim: 10 exec/s: 24 rss: 69Mb 00:07:46.410 ###### Recommended dictionary. ###### 00:07:46.410 "\000\000\000\000" # Uses: 0 00:07:46.410 "\001\002" # Uses: 2 00:07:46.410 ###### End of recommended dictionary. ###### 00:07:46.410 Done 49 runs in 2 second(s) 00:07:46.669 12:07:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:46.669 12:07:33 -- ../common.sh@72 -- # (( i++ )) 00:07:46.669 12:07:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:46.669 12:07:33 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:46.669 12:07:33 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:46.669 12:07:33 -- nvmf/run.sh@24 -- # local timen=1 00:07:46.669 12:07:33 -- nvmf/run.sh@25 -- # local core=0x1 00:07:46.669 12:07:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:46.669 12:07:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:46.669 12:07:33 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:46.669 12:07:33 -- nvmf/run.sh@29 -- # port=4407 00:07:46.669 12:07:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:46.669 12:07:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:46.669 12:07:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:46.669 12:07:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:46.669 [2024-11-02 12:07:33.499331] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
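[editor's note] The xtrace lines a few records up show run.sh tearing down the previous run's config and priming the next fuzzer. A condensed sketch of that per-run setup follows; the names $spdk and $n are mine (the real logic lives in start_llvm_fuzz in run.sh, with a few extra flags such as -P for the output directory):

  n=7                                   # fuzzer type, bumped once per loop iteration
  port="44$(printf %02d "$n")"          # -> 4407; run N gets TCP listener 44NN
  cfg="/tmp/fuzz_json_${n}.conf"
  corpus="$spdk/../corpus/llvm_nvmf_${n}"
  mkdir -p "$corpus"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # rewrite the template config so the target listens on this run's port
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$spdk/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$cfg"
  # time-bounded run (-t 1) against that listener, with a per-run corpus
  # directory and a per-run RPC socket
  "$spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -F "$trid" -c "$cfg" -t 1 -D "$corpus" -Z "$n" -r "/var/tmp/spdk${n}.sock"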
00:07:46.669 [2024-11-02 12:07:33.499396] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1146976 ] 00:07:46.669 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.928 [2024-11-02 12:07:33.751197] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.928 [2024-11-02 12:07:33.780399] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:46.928 [2024-11-02 12:07:33.780524] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.928 [2024-11-02 12:07:33.831973] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:46.928 [2024-11-02 12:07:33.848338] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:46.928 INFO: Running with entropic power schedule (0xFF, 100). 00:07:46.928 INFO: Seed: 1981773310 00:07:46.928 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:46.928 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:46.928 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:46.928 INFO: A corpus is not provided, starting from an empty corpus 00:07:46.928 #2 INITED exec/s: 0 rss: 59Mb 00:07:46.928 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:46.928 This may also happen if the target rejected all inputs we tried so far 00:07:46.928 [2024-11-02 12:07:33.897467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000288e cdw11:00000000 00:07:46.928 [2024-11-02 12:07:33.897495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.446 NEW_FUNC[1/669]: 0x45c8c8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:47.446 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:47.447 #4 NEW cov: 11533 ft: 11534 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 2 ChangeByte-InsertByte- 00:07:47.447 [2024-11-02 12:07:34.188309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.188346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.188406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.188422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.188478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.188494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.447 #7 NEW cov: 11646 ft: 12258 corp: 3/10b lim: 10 exec/s: 0 rss: 68Mb L: 7/7 MS: 3 
EraseBytes-ChangeBit-InsertRepeatedBytes- 00:07:47.447 [2024-11-02 12:07:34.238508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005886 cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.238535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.238599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f932 cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.238613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.238664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000e939 cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.238677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.238724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.238737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.447 #8 NEW cov: 11652 ft: 12790 corp: 4/19b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CMP- DE: "X\206\3712\3519\177\000"- 00:07:47.447 [2024-11-02 12:07:34.278692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005886 cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.278718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.278766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f932 cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.278780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.278827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008ee9 cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.278841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.278888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000397f cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.278900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.278947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.278959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.447 #9 NEW cov: 11737 ft: 13020 corp: 5/29b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CrossOver- 00:07:47.447 [2024-11-02 12:07:34.318573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.318598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.318665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffe3 cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.318679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.318729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.318742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.447 #10 NEW cov: 11737 ft: 13122 corp: 6/36b lim: 10 exec/s: 0 rss: 68Mb L: 7/10 MS: 1 ChangeByte- 00:07:47.447 [2024-11-02 12:07:34.358914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005886 cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.358939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.358987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f932 cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.359005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.359055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000e90a cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.359068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.359115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000397f cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.359128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.447 [2024-11-02 12:07:34.359178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.359191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.447 #11 NEW cov: 11737 ft: 13177 corp: 7/46b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:07:47.447 [2024-11-02 12:07:34.398549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d128 cdw11:00000000 00:07:47.447 [2024-11-02 12:07:34.398574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.447 #13 NEW cov: 11737 ft: 13284 corp: 8/48b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 2 EraseBytes-InsertByte- 00:07:47.706 [2024-11-02 12:07:34.439032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005886 cdw11:00000000 00:07:47.706 [2024-11-02 12:07:34.439057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.706 [2024-11-02 12:07:34.439106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f939 
cdw11:00000000 00:07:47.706 [2024-11-02 12:07:34.439118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.706 [2024-11-02 12:07:34.439166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:47.706 [2024-11-02 12:07:34.439178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.706 [2024-11-02 12:07:34.439242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:47.706 [2024-11-02 12:07:34.439255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.706 #14 NEW cov: 11737 ft: 13396 corp: 9/57b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 CrossOver- 00:07:47.706 [2024-11-02 12:07:34.479128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.706 [2024-11-02 12:07:34.479153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.706 [2024-11-02 12:07:34.479201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.706 [2024-11-02 12:07:34.479215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.479267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.479280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.479327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.479340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.707 #17 NEW cov: 11737 ft: 13421 corp: 10/66b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 3 CrossOver-ChangeByte-InsertRepeatedBytes- 00:07:47.707 [2024-11-02 12:07:34.519389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000058f9 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.519414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.519462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e986 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.519475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.519523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000320a cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.519536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.519584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 
nsid:0 cdw10:0000397f cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.519596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.519643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.519656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.707 #18 NEW cov: 11737 ft: 13447 corp: 11/76b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:47.707 [2024-11-02 12:07:34.559307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005886 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.559332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.559382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f939 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.559395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.559445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.559457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.707 #19 NEW cov: 11737 ft: 13466 corp: 12/83b lim: 10 exec/s: 0 rss: 68Mb L: 7/10 MS: 1 EraseBytes- 00:07:47.707 [2024-11-02 12:07:34.599529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005886 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.599554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.599603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f932 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.599616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.599664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008ee9 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.599680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.599727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00003900 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.599740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.707 #20 NEW cov: 11737 ft: 13492 corp: 13/92b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 CrossOver- 00:07:47.707 [2024-11-02 12:07:34.639597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.639621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 
12:07:34.639670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.639683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.639731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.639744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.639791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.639803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.707 #21 NEW cov: 11737 ft: 13577 corp: 14/101b lim: 10 exec/s: 0 rss: 68Mb L: 9/10 MS: 1 CopyPart- 00:07:47.707 [2024-11-02 12:07:34.679790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f939 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.679815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.679863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.679877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.679926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.679938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.707 [2024-11-02 12:07:34.679987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000ad1 cdw11:00000000 00:07:47.707 [2024-11-02 12:07:34.680005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.966 #23 NEW cov: 11737 ft: 13620 corp: 15/109b lim: 10 exec/s: 0 rss: 68Mb L: 8/10 MS: 2 EraseBytes-CrossOver- 00:07:47.966 [2024-11-02 12:07:34.719880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.966 [2024-11-02 12:07:34.719905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.966 [2024-11-02 12:07:34.719956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.966 [2024-11-02 12:07:34.719969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.966 [2024-11-02 12:07:34.720019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.966 [2024-11-02 12:07:34.720034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.966 
[2024-11-02 12:07:34.720101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000002ff cdw11:00000000 00:07:47.966 [2024-11-02 12:07:34.720114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.966 #24 NEW cov: 11737 ft: 13654 corp: 16/117b lim: 10 exec/s: 0 rss: 69Mb L: 8/10 MS: 1 CMP- DE: "\377\377\377\377\377\377\002\377"- 00:07:47.966 [2024-11-02 12:07:34.759672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:47.966 [2024-11-02 12:07:34.759697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.966 #25 NEW cov: 11737 ft: 13681 corp: 17/119b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 CopyPart- 00:07:47.966 [2024-11-02 12:07:34.800072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.967 [2024-11-02 12:07:34.800097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.967 [2024-11-02 12:07:34.800144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.967 [2024-11-02 12:07:34.800156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.967 [2024-11-02 12:07:34.800205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.967 [2024-11-02 12:07:34.800217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.967 [2024-11-02 12:07:34.800264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000000c6 cdw11:00000000 00:07:47.967 [2024-11-02 12:07:34.800276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.967 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:47.967 #26 NEW cov: 11760 ft: 13724 corp: 18/127b lim: 10 exec/s: 0 rss: 69Mb L: 8/10 MS: 1 EraseBytes- 00:07:47.967 [2024-11-02 12:07:34.839904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002892 cdw11:00000000 00:07:47.967 [2024-11-02 12:07:34.839928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.967 #27 NEW cov: 11760 ft: 13755 corp: 19/130b lim: 10 exec/s: 0 rss: 69Mb L: 3/10 MS: 1 InsertByte- 00:07:47.967 [2024-11-02 12:07:34.880351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.967 [2024-11-02 12:07:34.880376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.967 [2024-11-02 12:07:34.880440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.967 [2024-11-02 12:07:34.880454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.967 [2024-11-02 12:07:34.880502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:47.967 [2024-11-02 12:07:34.880515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.967 [2024-11-02 12:07:34.880563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000000c6 cdw11:00000000 00:07:47.967 [2024-11-02 12:07:34.880576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.967 #28 NEW cov: 11760 ft: 13778 corp: 20/138b lim: 10 exec/s: 28 rss: 69Mb L: 8/10 MS: 1 EraseBytes- 00:07:47.967 [2024-11-02 12:07:34.920107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:47.967 [2024-11-02 12:07:34.920132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.226 #29 NEW cov: 11760 ft: 13790 corp: 21/140b lim: 10 exec/s: 29 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:48.226 [2024-11-02 12:07:34.960481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:48.226 [2024-11-02 12:07:34.960506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:34.960557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.226 [2024-11-02 12:07:34.960570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:34.960619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.226 [2024-11-02 12:07:34.960632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.226 #30 NEW cov: 11760 ft: 13840 corp: 22/147b lim: 10 exec/s: 30 rss: 69Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:07:48.226 [2024-11-02 12:07:35.000795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.000820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:35.000868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000041 cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.000880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:35.000929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.000941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:35.001006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.226 
[2024-11-02 12:07:35.001019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:35.001068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:000000c6 cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.001081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:48.226 #31 NEW cov: 11760 ft: 13860 corp: 23/157b lim: 10 exec/s: 31 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:07:48.226 [2024-11-02 12:07:35.040835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000039ff cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.040861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:35.040907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.040921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:35.040969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.040983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:35.041037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.041050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.226 #34 NEW cov: 11760 ft: 13915 corp: 24/166b lim: 10 exec/s: 34 rss: 69Mb L: 9/10 MS: 3 ShuffleBytes-ChangeByte-PersAutoDict- DE: "\377\377\377\377\377\377\002\377"- 00:07:48.226 [2024-11-02 12:07:35.080940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005886 cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.080965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:35.081030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f932 cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.081045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:35.081103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000e939 cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.081116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.226 [2024-11-02 12:07:35.081163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007f00 cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.081177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.226 #35 NEW cov: 11760 ft: 13924 corp: 25/175b lim: 10 exec/s: 35 rss: 69Mb L: 9/10 MS: 1 PersAutoDict- DE: 
"X\206\3712\3519\177\000"- 00:07:48.226 [2024-11-02 12:07:35.120713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d1cb cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.120738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.226 #36 NEW cov: 11760 ft: 13944 corp: 26/178b lim: 10 exec/s: 36 rss: 69Mb L: 3/10 MS: 1 InsertByte- 00:07:48.226 [2024-11-02 12:07:35.160826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d1cb cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.160850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.226 #37 NEW cov: 11760 ft: 13958 corp: 27/181b lim: 10 exec/s: 37 rss: 69Mb L: 3/10 MS: 1 ChangeBit- 00:07:48.226 [2024-11-02 12:07:35.201390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.226 [2024-11-02 12:07:35.201416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.227 [2024-11-02 12:07:35.201466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.227 [2024-11-02 12:07:35.201479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.227 [2024-11-02 12:07:35.201528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.227 [2024-11-02 12:07:35.201541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.227 [2024-11-02 12:07:35.201590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000002ff cdw11:00000000 00:07:48.227 [2024-11-02 12:07:35.201603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.486 #38 NEW cov: 11760 ft: 13989 corp: 28/189b lim: 10 exec/s: 38 rss: 69Mb L: 8/10 MS: 1 ShuffleBytes- 00:07:48.486 [2024-11-02 12:07:35.241324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.241352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.486 [2024-11-02 12:07:35.241402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.241415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.486 [2024-11-02 12:07:35.241464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.241477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.486 #39 NEW cov: 11760 ft: 13996 corp: 29/196b lim: 10 exec/s: 39 rss: 69Mb L: 7/10 MS: 1 EraseBytes- 00:07:48.486 [2024-11-02 12:07:35.281648] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005886 cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.281674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.486 [2024-11-02 12:07:35.281726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f932 cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.281739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.486 [2024-11-02 12:07:35.281787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008e93 cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.281800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.486 [2024-11-02 12:07:35.281848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000397f cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.281860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.486 [2024-11-02 12:07:35.281908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.281921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:48.486 #40 NEW cov: 11760 ft: 14009 corp: 30/206b lim: 10 exec/s: 40 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:48.486 [2024-11-02 12:07:35.321331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000101 cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.321356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.486 #43 NEW cov: 11760 ft: 14013 corp: 31/208b lim: 10 exec/s: 43 rss: 69Mb L: 2/10 MS: 3 ShuffleBytes-ChangeBinInt-CopyPart- 00:07:48.486 [2024-11-02 12:07:35.351826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.351850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.486 [2024-11-02 12:07:35.351923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007f39 cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.351937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.486 [2024-11-02 12:07:35.351985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000e9d9 cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.352003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.486 [2024-11-02 12:07:35.352049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00008ac2 cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.352065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.486 [2024-11-02 
12:07:35.352111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00004ec6 cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.352123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:48.486 #44 NEW cov: 11760 ft: 14023 corp: 32/218b lim: 10 exec/s: 44 rss: 69Mb L: 10/10 MS: 1 CMP- DE: "\000\1779\351\331\212\302N"- 00:07:48.486 [2024-11-02 12:07:35.391535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d12a cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.391559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.486 #45 NEW cov: 11760 ft: 14030 corp: 33/220b lim: 10 exec/s: 45 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:07:48.486 [2024-11-02 12:07:35.421565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000101 cdw11:00000000 00:07:48.486 [2024-11-02 12:07:35.421589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.486 #46 NEW cov: 11760 ft: 14049 corp: 34/222b lim: 10 exec/s: 46 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:48.746 [2024-11-02 12:07:35.461894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d1aa cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.461919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.746 [2024-11-02 12:07:35.461969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000cb29 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.461982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.746 #47 NEW cov: 11760 ft: 14215 corp: 35/226b lim: 10 exec/s: 47 rss: 70Mb L: 4/10 MS: 1 InsertByte- 00:07:48.746 [2024-11-02 12:07:35.501899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002892 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.501923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.746 #48 NEW cov: 11760 ft: 14231 corp: 36/229b lim: 10 exec/s: 48 rss: 70Mb L: 3/10 MS: 1 ShuffleBytes- 00:07:48.746 [2024-11-02 12:07:35.542283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.542308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.746 [2024-11-02 12:07:35.542354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.542367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.746 [2024-11-02 12:07:35.542431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.542444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:07:48.746 [2024-11-02 12:07:35.542494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.542507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.746 #49 NEW cov: 11760 ft: 14251 corp: 37/238b lim: 10 exec/s: 49 rss: 70Mb L: 9/10 MS: 1 ChangeBit- 00:07:48.746 [2024-11-02 12:07:35.582342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000af9 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.582367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.746 [2024-11-02 12:07:35.582420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.582434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.746 [2024-11-02 12:07:35.582482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000587f cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.582495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.746 [2024-11-02 12:07:35.582542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00003986 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.582555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.746 #50 NEW cov: 11760 ft: 14293 corp: 38/247b lim: 10 exec/s: 50 rss: 70Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:48.746 [2024-11-02 12:07:35.622141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d19b cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.622165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.746 #51 NEW cov: 11760 ft: 14313 corp: 39/249b lim: 10 exec/s: 51 rss: 70Mb L: 2/10 MS: 1 ChangeByte- 00:07:48.746 [2024-11-02 12:07:35.662314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d17a cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.662339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.746 #52 NEW cov: 11760 ft: 14321 corp: 40/252b lim: 10 exec/s: 52 rss: 70Mb L: 3/10 MS: 1 InsertByte- 00:07:48.746 [2024-11-02 12:07:35.702678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.702703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.746 [2024-11-02 12:07:35.702753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.702766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.746 [2024-11-02 12:07:35.702814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.702828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.746 [2024-11-02 12:07:35.702876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000020 cdw11:00000000 00:07:48.746 [2024-11-02 12:07:35.702889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.005 #53 NEW cov: 11760 ft: 14387 corp: 41/261b lim: 10 exec/s: 53 rss: 70Mb L: 9/10 MS: 1 ChangeBit- 00:07:49.005 [2024-11-02 12:07:35.742489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002128 cdw11:00000000 00:07:49.005 [2024-11-02 12:07:35.742515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.005 #54 NEW cov: 11760 ft: 14407 corp: 42/263b lim: 10 exec/s: 54 rss: 70Mb L: 2/10 MS: 1 ChangeByte- 00:07:49.005 [2024-11-02 12:07:35.773069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:49.005 [2024-11-02 12:07:35.773095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.005 [2024-11-02 12:07:35.773146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.005 [2024-11-02 12:07:35.773166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.006 [2024-11-02 12:07:35.773214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.006 [2024-11-02 12:07:35.773227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.006 [2024-11-02 12:07:35.773277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.006 [2024-11-02 12:07:35.773289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.006 [2024-11-02 12:07:35.773338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:07:49.006 [2024-11-02 12:07:35.773351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.006 #55 NEW cov: 11760 ft: 14413 corp: 43/273b lim: 10 exec/s: 55 rss: 70Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:49.006 [2024-11-02 12:07:35.813221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000588e cdw11:00000000 00:07:49.006 [2024-11-02 12:07:35.813245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.006 [2024-11-02 12:07:35.813295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003986 cdw11:00000000 00:07:49.006 [2024-11-02 12:07:35.813308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:49.006 [2024-11-02 12:07:35.813357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000032e9 cdw11:00000000 00:07:49.006 [2024-11-02 12:07:35.813370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.006 [2024-11-02 12:07:35.813435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007ff9 cdw11:00000000 00:07:49.006 [2024-11-02 12:07:35.813448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.006 [2024-11-02 12:07:35.813495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:07:49.006 [2024-11-02 12:07:35.813508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.006 #56 NEW cov: 11760 ft: 14423 corp: 44/283b lim: 10 exec/s: 56 rss: 70Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:49.006 [2024-11-02 12:07:35.852864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0b cdw11:00000000 00:07:49.006 [2024-11-02 12:07:35.852889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.006 #57 NEW cov: 11760 ft: 14447 corp: 45/285b lim: 10 exec/s: 57 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:49.006 [2024-11-02 12:07:35.893055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000af9 cdw11:00000000 00:07:49.006 [2024-11-02 12:07:35.893081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.006 [2024-11-02 12:07:35.893131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.006 [2024-11-02 12:07:35.893144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.006 #58 NEW cov: 11760 ft: 14455 corp: 46/290b lim: 10 exec/s: 29 rss: 70Mb L: 5/10 MS: 1 EraseBytes- 00:07:49.006 #58 DONE cov: 11760 ft: 14455 corp: 46/290b lim: 10 exec/s: 29 rss: 70Mb 00:07:49.006 ###### Recommended dictionary. ###### 00:07:49.006 "X\206\3712\3519\177\000" # Uses: 1 00:07:49.006 "\377\377\377\377\377\377\002\377" # Uses: 1 00:07:49.006 "\000\1779\351\331\212\302N" # Uses: 0 00:07:49.006 ###### End of recommended dictionary. 
###### 00:07:49.006 Done 58 runs in 2 second(s) 00:07:49.264 12:07:36 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:49.264 12:07:36 -- ../common.sh@72 -- # (( i++ )) 00:07:49.264 12:07:36 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.264 12:07:36 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:49.264 12:07:36 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:49.264 12:07:36 -- nvmf/run.sh@24 -- # local timen=1 00:07:49.264 12:07:36 -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.264 12:07:36 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:49.264 12:07:36 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:49.264 12:07:36 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:49.264 12:07:36 -- nvmf/run.sh@29 -- # port=4408 00:07:49.264 12:07:36 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:49.264 12:07:36 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:49.264 12:07:36 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.264 12:07:36 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:49.264 [2024-11-02 12:07:36.076165] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:49.264 [2024-11-02 12:07:36.076247] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1147415 ] 00:07:49.264 EAL: No free 2048 kB hugepages reported on node 1 00:07:49.523 [2024-11-02 12:07:36.337454] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.523 [2024-11-02 12:07:36.364964] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:49.523 [2024-11-02 12:07:36.365110] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.523 [2024-11-02 12:07:36.416478] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:49.523 [2024-11-02 12:07:36.432844] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:49.523 INFO: Running with entropic power schedule (0xFF, 100). 
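[Note] The start_llvm_fuzz trace above is readable as a small shell sequence: the fuzzer index picks a dedicated TCP port, a per-fuzzer copy of the JSON target config is repointed at that port, and llvm_nvme_fuzz is started against the resulting TCP transport ID. The sketch below condenses it using only what is logged here; the 44xx port rule is inferred from the 4408/4409 ports seen in this log, redirecting the sed output into the /tmp config file is an assumption (only the sed expression itself is traced), and the flag comments are inferences from the EAL parameter echo, not from the script source.

    #!/usr/bin/env bash
    # Condensed reconstruction of the logged "start_llvm_fuzz 8 1 0x1" sequence.
    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    fuzzer_type=8   # first argument: which fuzzer to run (9 follows below)
    timen=1         # second argument: passed through as -t
    core=0x1        # third argument: core mask, passed as -m (echoed back as -c 0x1 in the EAL parameters)
    port="44$(printf %02d "$fuzzer_type")"   # -> 4408 here; 4409 for fuzzer 9
    corpus_dir="$rootdir/../corpus/llvm_nvmf_$fuzzer_type"
    nvmf_cfg="/tmp/fuzz_json_$fuzzer_type.conf"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    mkdir -p "$corpus_dir"
    # Repoint the default NVMe/TCP listener (4420) at this instance's port:
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$rootdir/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"

Running a fresh instance per fuzzer index keeps each run isolated: its own listener port, its own corpus directory, and its own RPC socket, so the "NVMe/TCP Target Listening on 127.0.0.1 port 4408" line above confirms the substitution took effect.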
00:07:49.523 INFO: Seed: 270786225 00:07:49.523 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:49.523 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:49.524 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:49.524 INFO: A corpus is not provided, starting from an empty corpus 00:07:49.524 [2024-11-02 12:07:36.481477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.524 [2024-11-02 12:07:36.481505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.783 #2 INITED cov: 11561 ft: 11562 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:49.783 [2024-11-02 12:07:36.511591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.511618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.511675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.511689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.783 #3 NEW cov: 11674 ft: 12670 corp: 2/3b lim: 5 exec/s: 0 rss: 65Mb L: 2/2 MS: 1 InsertByte- 00:07:49.783 [2024-11-02 12:07:36.561957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.561983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.562047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.562062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.562122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.562135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.783 #4 NEW cov: 11680 ft: 13136 corp: 3/6b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 InsertByte- 00:07:49.783 [2024-11-02 12:07:36.601686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.601711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.783 #5 NEW cov: 11765 ft: 13546 corp: 4/7b lim: 5 exec/s: 0 rss: 66Mb L: 1/3 MS: 1 ChangeBinInt- 00:07:49.783 [2024-11-02 12:07:36.641930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.641955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.642015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.642029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.783 #6 NEW cov: 11765 ft: 13747 corp: 5/9b lim: 5 exec/s: 0 rss: 66Mb L: 2/3 MS: 1 CrossOver- 00:07:49.783 [2024-11-02 12:07:36.682511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.682537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.682595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.682608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.682662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.682675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.682730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.682743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.682800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.682817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.783 #7 NEW cov: 11765 ft: 14156 corp: 6/14b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 CopyPart- 00:07:49.783 [2024-11-02 12:07:36.732690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.732715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.732772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.732786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.732840] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.732853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.732908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.732921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.783 [2024-11-02 12:07:36.732977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.783 [2024-11-02 12:07:36.732990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.043 #8 NEW cov: 11765 ft: 14274 corp: 7/19b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 ChangeBinInt- 00:07:50.043 [2024-11-02 12:07:36.782251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.782276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.043 #9 NEW cov: 11765 ft: 14433 corp: 8/20b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 EraseBytes- 00:07:50.043 [2024-11-02 12:07:36.822648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.822673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.822733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.822747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.822807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.822820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.043 #10 NEW cov: 11765 ft: 14547 corp: 9/23b lim: 5 exec/s: 0 rss: 66Mb L: 3/5 MS: 1 InsertByte- 00:07:50.043 [2024-11-02 12:07:36.862780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.862805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.862865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.862878] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.862951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.862965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.043 #11 NEW cov: 11765 ft: 14579 corp: 10/26b lim: 5 exec/s: 0 rss: 66Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:50.043 [2024-11-02 12:07:36.902903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.902929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.902988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.903006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.903064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.903078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.043 #12 NEW cov: 11765 ft: 14592 corp: 11/29b lim: 5 exec/s: 0 rss: 67Mb L: 3/5 MS: 1 InsertByte- 00:07:50.043 [2024-11-02 12:07:36.943286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.943311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.943384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.943398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.943452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.943465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.943519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.943533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.943588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.943601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.043 #13 NEW cov: 11765 ft: 14677 corp: 12/34b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeBit- 00:07:50.043 [2024-11-02 12:07:36.983274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.983298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.983359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.983373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.983429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.983442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.043 [2024-11-02 12:07:36.983495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.043 [2024-11-02 12:07:36.983509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.043 #14 NEW cov: 11765 ft: 14686 corp: 13/38b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 CopyPart- 00:07:50.302 [2024-11-02 12:07:37.023074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.302 [2024-11-02 12:07:37.023099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.302 [2024-11-02 12:07:37.023157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.302 [2024-11-02 12:07:37.023171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.302 #15 NEW cov: 11765 ft: 14797 corp: 14/40b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:50.302 [2024-11-02 12:07:37.063066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.302 [2024-11-02 12:07:37.063091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.302 #16 NEW cov: 11765 ft: 14845 corp: 15/41b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:50.302 [2024-11-02 12:07:37.103354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.302 [2024-11-02 12:07:37.103380] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.302 [2024-11-02 12:07:37.103438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.302 [2024-11-02 12:07:37.103451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.302 #17 NEW cov: 11765 ft: 14856 corp: 16/43b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeASCIIInt- 00:07:50.302 [2024-11-02 12:07:37.143288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.302 [2024-11-02 12:07:37.143313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.302 #18 NEW cov: 11765 ft: 14875 corp: 17/44b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:07:50.302 [2024-11-02 12:07:37.183568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.302 [2024-11-02 12:07:37.183593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.302 [2024-11-02 12:07:37.183652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.302 [2024-11-02 12:07:37.183665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.302 #19 NEW cov: 11765 ft: 14960 corp: 18/46b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:07:50.302 [2024-11-02 12:07:37.223831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.302 [2024-11-02 12:07:37.223857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.302 [2024-11-02 12:07:37.223932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.302 [2024-11-02 12:07:37.223946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.302 [2024-11-02 12:07:37.224008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.302 [2024-11-02 12:07:37.224022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.302 #20 NEW cov: 11765 ft: 14966 corp: 19/49b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 EraseBytes- 00:07:50.302 [2024-11-02 12:07:37.263680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.303 [2024-11-02 12:07:37.263707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:50.561 #21 NEW cov: 11765 ft: 14980 corp: 20/50b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:50.561 [2024-11-02 12:07:37.304098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.561 [2024-11-02 12:07:37.304125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.561 [2024-11-02 12:07:37.304202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.561 [2024-11-02 12:07:37.304217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.561 [2024-11-02 12:07:37.304274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.561 [2024-11-02 12:07:37.304288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.561 #22 NEW cov: 11765 ft: 15013 corp: 21/53b lim: 5 exec/s: 0 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:07:50.561 [2024-11-02 12:07:37.344061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.561 [2024-11-02 12:07:37.344088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.561 [2024-11-02 12:07:37.344147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.561 [2024-11-02 12:07:37.344161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.561 #23 NEW cov: 11765 ft: 15034 corp: 22/55b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:07:50.561 [2024-11-02 12:07:37.384491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.561 [2024-11-02 12:07:37.384520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.561 [2024-11-02 12:07:37.384577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.562 [2024-11-02 12:07:37.384592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.562 [2024-11-02 12:07:37.384647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.562 [2024-11-02 12:07:37.384660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.562 [2024-11-02 12:07:37.384717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:50.562 [2024-11-02 12:07:37.384730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.820 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:50.820 #24 NEW cov: 11788 ft: 15081 corp: 23/59b lim: 5 exec/s: 24 rss: 69Mb L: 4/5 MS: 1 ChangeBinInt- 00:07:50.820 [2024-11-02 12:07:37.674897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.820 [2024-11-02 12:07:37.674930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.820 [2024-11-02 12:07:37.674987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.820 [2024-11-02 12:07:37.675011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.820 #25 NEW cov: 11788 ft: 15112 corp: 24/61b lim: 5 exec/s: 25 rss: 69Mb L: 2/5 MS: 1 EraseBytes- 00:07:50.820 [2024-11-02 12:07:37.714917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.820 [2024-11-02 12:07:37.714944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.820 [2024-11-02 12:07:37.715018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.820 [2024-11-02 12:07:37.715033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.820 #26 NEW cov: 11788 ft: 15134 corp: 25/63b lim: 5 exec/s: 26 rss: 69Mb L: 2/5 MS: 1 ChangeBit- 00:07:50.820 [2024-11-02 12:07:37.755178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.820 [2024-11-02 12:07:37.755203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.820 [2024-11-02 12:07:37.755287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.820 [2024-11-02 12:07:37.755300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.820 [2024-11-02 12:07:37.755355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.820 [2024-11-02 12:07:37.755368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.820 #27 NEW cov: 11788 ft: 15153 corp: 26/66b lim: 5 exec/s: 27 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:07:50.820 [2024-11-02 12:07:37.795314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.820 [2024-11-02 12:07:37.795340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.820 [2024-11-02 12:07:37.795398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.821 [2024-11-02 12:07:37.795412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.821 [2024-11-02 12:07:37.795469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.821 [2024-11-02 12:07:37.795482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.080 #28 NEW cov: 11788 ft: 15157 corp: 27/69b lim: 5 exec/s: 28 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:07:51.080 [2024-11-02 12:07:37.835538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.835562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.080 [2024-11-02 12:07:37.835629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.835642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.080 [2024-11-02 12:07:37.835693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.835706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.080 [2024-11-02 12:07:37.835759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.835771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.080 #29 NEW cov: 11788 ft: 15203 corp: 28/73b lim: 5 exec/s: 29 rss: 69Mb L: 4/5 MS: 1 InsertByte- 00:07:51.080 [2024-11-02 12:07:37.875524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.875549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.080 [2024-11-02 12:07:37.875606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.875620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.080 [2024-11-02 12:07:37.875674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.875687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.080 #30 NEW cov: 11788 ft: 15228 corp: 29/76b lim: 5 exec/s: 30 rss: 70Mb L: 3/5 MS: 1 ChangeBinInt- 00:07:51.080 [2024-11-02 12:07:37.915456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.915484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.080 [2024-11-02 12:07:37.915555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.915569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.080 #31 NEW cov: 11788 ft: 15237 corp: 30/78b lim: 5 exec/s: 31 rss: 70Mb L: 2/5 MS: 1 EraseBytes- 00:07:51.080 [2024-11-02 12:07:37.955616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.955641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.080 [2024-11-02 12:07:37.955711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.955725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.080 #32 NEW cov: 11788 ft: 15254 corp: 31/80b lim: 5 exec/s: 32 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:51.080 [2024-11-02 12:07:37.995535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:37.995560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.080 #33 NEW cov: 11788 ft: 15262 corp: 32/81b lim: 5 exec/s: 33 rss: 70Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:51.080 [2024-11-02 12:07:38.035662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.080 [2024-11-02 12:07:38.035687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.339 #34 NEW cov: 11788 ft: 15268 corp: 33/82b lim: 5 exec/s: 34 rss: 70Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:51.339 [2024-11-02 12:07:38.076110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.339 [2024-11-02 12:07:38.076135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.339 [2024-11-02 12:07:38.076206] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.339 [2024-11-02 12:07:38.076219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.339 [2024-11-02 12:07:38.076273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.339 [2024-11-02 12:07:38.076286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.339 #35 NEW cov: 11788 ft: 15282 corp: 34/85b lim: 5 exec/s: 35 rss: 70Mb L: 3/5 MS: 1 ChangeBinInt- 00:07:51.339 [2024-11-02 12:07:38.116193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.339 [2024-11-02 12:07:38.116217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.339 [2024-11-02 12:07:38.116289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.339 [2024-11-02 12:07:38.116306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.339 [2024-11-02 12:07:38.116362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.339 [2024-11-02 12:07:38.116376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.339 #36 NEW cov: 11788 ft: 15286 corp: 35/88b lim: 5 exec/s: 36 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:07:51.339 [2024-11-02 12:07:38.156457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.339 [2024-11-02 12:07:38.156481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.339 [2024-11-02 12:07:38.156552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.339 [2024-11-02 12:07:38.156566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.339 [2024-11-02 12:07:38.156620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.339 [2024-11-02 12:07:38.156634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.339 [2024-11-02 12:07:38.156688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.339 [2024-11-02 12:07:38.156701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 
m:0 dnr:0 00:07:51.339 #37 NEW cov: 11788 ft: 15293 corp: 36/92b lim: 5 exec/s: 37 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:51.339 [2024-11-02 12:07:38.196306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.339 [2024-11-02 12:07:38.196331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.340 [2024-11-02 12:07:38.196387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.340 [2024-11-02 12:07:38.196400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.340 #38 NEW cov: 11788 ft: 15314 corp: 37/94b lim: 5 exec/s: 38 rss: 70Mb L: 2/5 MS: 1 EraseBytes- 00:07:51.340 [2024-11-02 12:07:38.236533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.340 [2024-11-02 12:07:38.236557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.340 [2024-11-02 12:07:38.236613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.340 [2024-11-02 12:07:38.236627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.340 [2024-11-02 12:07:38.236682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.340 [2024-11-02 12:07:38.236711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.340 #39 NEW cov: 11788 ft: 15318 corp: 38/97b lim: 5 exec/s: 39 rss: 70Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:51.340 [2024-11-02 12:07:38.276552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.340 [2024-11-02 12:07:38.276577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.340 [2024-11-02 12:07:38.276630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.340 [2024-11-02 12:07:38.276644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.340 #40 NEW cov: 11788 ft: 15331 corp: 39/99b lim: 5 exec/s: 40 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:51.599 [2024-11-02 12:07:38.316630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.599 [2024-11-02 12:07:38.316655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.599 [2024-11-02 12:07:38.316711] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.599 [2024-11-02 12:07:38.316724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.599 #41 NEW cov: 11788 ft: 15342 corp: 40/101b lim: 5 exec/s: 41 rss: 70Mb L: 2/5 MS: 1 EraseBytes- 00:07:51.599 [2024-11-02 12:07:38.356778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.599 [2024-11-02 12:07:38.356806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.599 [2024-11-02 12:07:38.356876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.599 [2024-11-02 12:07:38.356890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.599 #42 NEW cov: 11788 ft: 15347 corp: 41/103b lim: 5 exec/s: 42 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:51.599 [2024-11-02 12:07:38.396863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.599 [2024-11-02 12:07:38.396888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.599 [2024-11-02 12:07:38.396943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.599 [2024-11-02 12:07:38.396957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.599 #43 NEW cov: 11788 ft: 15357 corp: 42/105b lim: 5 exec/s: 43 rss: 70Mb L: 2/5 MS: 1 EraseBytes- 00:07:51.599 [2024-11-02 12:07:38.437132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.599 [2024-11-02 12:07:38.437156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.599 [2024-11-02 12:07:38.437230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.599 [2024-11-02 12:07:38.437244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.599 [2024-11-02 12:07:38.437300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.599 [2024-11-02 12:07:38.437316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.599 #44 NEW cov: 11788 ft: 15387 corp: 43/108b lim: 5 exec/s: 44 rss: 70Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:51.599 [2024-11-02 12:07:38.477107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 
cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.599 [2024-11-02 12:07:38.477133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.599 [2024-11-02 12:07:38.477189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.599 [2024-11-02 12:07:38.477202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.599 #45 NEW cov: 11788 ft: 15399 corp: 44/110b lim: 5 exec/s: 22 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:51.599 #45 DONE cov: 11788 ft: 15399 corp: 44/110b lim: 5 exec/s: 22 rss: 70Mb 00:07:51.599 Done 45 runs in 2 second(s) 00:07:51.858 12:07:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:51.858 12:07:38 -- ../common.sh@72 -- # (( i++ )) 00:07:51.858 12:07:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:51.858 12:07:38 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:51.858 12:07:38 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:51.858 12:07:38 -- nvmf/run.sh@24 -- # local timen=1 00:07:51.858 12:07:38 -- nvmf/run.sh@25 -- # local core=0x1 00:07:51.858 12:07:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:51.858 12:07:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:51.858 12:07:38 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:51.858 12:07:38 -- nvmf/run.sh@29 -- # port=4409 00:07:51.858 12:07:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:51.858 12:07:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:51.858 12:07:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:51.858 12:07:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:51.858 [2024-11-02 12:07:38.652358] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
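[Note] The status lines interleaved with the traces follow stock libFuzzer's format, so the run that just completed above can be read directly (the L: reading is an inference from this log):

    #45 NEW cov: 11788 ft: 15399 corp: 44/110b lim: 5 exec/s: 22 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes-

Input number 45 reached new coverage: 11788 edges and 15399 features seen so far, a corpus of 44 inputs totalling 110 bytes, a current input-length cap of 5 bytes, and 70Mb resident memory; L: 2/5 appears to pair the new input's 2-byte length with the longest corpus entry, and MS: names the single mutation (ShuffleBytes) that produced it. The closing "#45 DONE ... exec/s: 22" agrees with the adjacent "Done 45 runs in 2 second(s)": 45 executions over 2 seconds truncate to 22 exec/s. The "Recommended dictionary" block printed after a run (see the octal-escaped entries such as "\377\377\377\377\377\377\002\377" above) lists byte strings the fuzzer found productive, each with its use count.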
00:07:51.858 [2024-11-02 12:07:38.652429] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1147814 ] 00:07:51.858 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.117 [2024-11-02 12:07:38.910086] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.117 [2024-11-02 12:07:38.938401] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.117 [2024-11-02 12:07:38.938537] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.117 [2024-11-02 12:07:38.990142] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.117 [2024-11-02 12:07:39.006504] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:52.117 INFO: Running with entropic power schedule (0xFF, 100). 00:07:52.117 INFO: Seed: 2846787490 00:07:52.117 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:52.117 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:52.117 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:52.117 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.117 [2024-11-02 12:07:39.083169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.117 [2024-11-02 12:07:39.083214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.375 #2 INITED cov: 11516 ft: 11517 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:52.375 [2024-11-02 12:07:39.133248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.375 [2024-11-02 12:07:39.133278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.632 NEW_FUNC[1/6]: 0xf81d48 in _sock_flush /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1317 00:07:52.632 NEW_FUNC[2/6]: 0x16c3978 in spdk_nvme_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:757 00:07:52.632 #3 NEW cov: 11674 ft: 12061 corp: 2/2b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ShuffleBytes- 00:07:52.632 [2024-11-02 12:07:39.443573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.632 [2024-11-02 12:07:39.443606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.632 [2024-11-02 12:07:39.443729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.632 [2024-11-02 12:07:39.443746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.632 #4 NEW cov: 11680 ft: 13250 corp: 3/4b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CrossOver- 00:07:52.632 [2024-11-02 12:07:39.493656] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.632 [2024-11-02 12:07:39.493684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.632 [2024-11-02 12:07:39.493814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.632 [2024-11-02 12:07:39.493831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.632 #5 NEW cov: 11765 ft: 13411 corp: 4/6b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CrossOver- 00:07:52.632 [2024-11-02 12:07:39.533480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.632 [2024-11-02 12:07:39.533508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.632 #6 NEW cov: 11765 ft: 13546 corp: 5/7b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeBit- 00:07:52.632 [2024-11-02 12:07:39.573612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.632 [2024-11-02 12:07:39.573639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.632 #7 NEW cov: 11765 ft: 13637 corp: 6/8b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeByte- 00:07:52.889 [2024-11-02 12:07:39.613783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.889 [2024-11-02 12:07:39.613810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.889 #8 NEW cov: 11765 ft: 13686 corp: 7/9b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeByte- 00:07:52.889 [2024-11-02 12:07:39.653888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.889 [2024-11-02 12:07:39.653918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.889 #9 NEW cov: 11765 ft: 13739 corp: 8/10b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeByte- 00:07:52.889 [2024-11-02 12:07:39.694210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.889 [2024-11-02 12:07:39.694238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.889 [2024-11-02 12:07:39.694357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.889 [2024-11-02 12:07:39.694384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.889 #10 NEW cov: 11765 ft: 13774 corp: 9/12b lim: 5 
exec/s: 0 rss: 67Mb L: 2/2 MS: 1 InsertByte- 00:07:52.889 [2024-11-02 12:07:39.734672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.889 [2024-11-02 12:07:39.734700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.889 [2024-11-02 12:07:39.734822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.889 [2024-11-02 12:07:39.734841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.890 [2024-11-02 12:07:39.734960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.890 [2024-11-02 12:07:39.734978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.890 #11 NEW cov: 11765 ft: 14047 corp: 10/15b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 InsertByte- 00:07:52.890 [2024-11-02 12:07:39.784843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.890 [2024-11-02 12:07:39.784870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.890 [2024-11-02 12:07:39.785009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.890 [2024-11-02 12:07:39.785027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.890 [2024-11-02 12:07:39.785153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.890 [2024-11-02 12:07:39.785171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.890 #12 NEW cov: 11765 ft: 14075 corp: 11/18b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ChangeBinInt- 00:07:52.890 [2024-11-02 12:07:39.824862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.890 [2024-11-02 12:07:39.824889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.890 [2024-11-02 12:07:39.825013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.890 [2024-11-02 12:07:39.825041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.890 [2024-11-02 12:07:39.825161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.890 [2024-11-02 12:07:39.825176] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.890 #13 NEW cov: 11765 ft: 14086 corp: 12/21b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 CrossOver- 00:07:52.890 [2024-11-02 12:07:39.864543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.890 [2024-11-02 12:07:39.864570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.148 #14 NEW cov: 11765 ft: 14164 corp: 13/22b lim: 5 exec/s: 0 rss: 68Mb L: 1/3 MS: 1 CopyPart- 00:07:53.148 [2024-11-02 12:07:39.905166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:39.905193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.148 [2024-11-02 12:07:39.905318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:39.905336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.148 [2024-11-02 12:07:39.905458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:39.905475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.148 #15 NEW cov: 11765 ft: 14186 corp: 14/25b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 ChangeBinInt- 00:07:53.148 [2024-11-02 12:07:39.944699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:39.944725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.148 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:53.148 #16 NEW cov: 11788 ft: 14221 corp: 15/26b lim: 5 exec/s: 0 rss: 68Mb L: 1/3 MS: 1 EraseBytes- 00:07:53.148 [2024-11-02 12:07:39.985081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:39.985114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.148 [2024-11-02 12:07:39.985250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:39.985267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.148 #17 NEW cov: 11788 ft: 14241 corp: 16/28b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 CrossOver- 00:07:53.148 [2024-11-02 12:07:40.025635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 
nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:40.025661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.148 [2024-11-02 12:07:40.025783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:40.025800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.148 #18 NEW cov: 11788 ft: 14312 corp: 17/30b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 InsertByte- 00:07:53.148 [2024-11-02 12:07:40.065418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:40.065444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.148 [2024-11-02 12:07:40.065563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:40.065579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.148 #19 NEW cov: 11788 ft: 14377 corp: 18/32b lim: 5 exec/s: 19 rss: 68Mb L: 2/3 MS: 1 CrossOver- 00:07:53.148 [2024-11-02 12:07:40.105813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:40.105842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.148 [2024-11-02 12:07:40.105969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:40.105988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.148 [2024-11-02 12:07:40.106116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.148 [2024-11-02 12:07:40.106132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.406 #20 NEW cov: 11788 ft: 14404 corp: 19/35b lim: 5 exec/s: 20 rss: 68Mb L: 3/3 MS: 1 ChangeBit- 00:07:53.406 [2024-11-02 12:07:40.146440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.146466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.406 [2024-11-02 12:07:40.146593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.146609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.406 [2024-11-02 12:07:40.146742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.146758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.406 [2024-11-02 12:07:40.146885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.146901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.406 [2024-11-02 12:07:40.147024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.147040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.406 #21 NEW cov: 11788 ft: 14733 corp: 20/40b lim: 5 exec/s: 21 rss: 68Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:53.406 [2024-11-02 12:07:40.195831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.195860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.406 [2024-11-02 12:07:40.195990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.196011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.406 #22 NEW cov: 11788 ft: 14742 corp: 21/42b lim: 5 exec/s: 22 rss: 68Mb L: 2/5 MS: 1 ChangeByte- 00:07:53.406 [2024-11-02 12:07:40.235964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.235990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.406 [2024-11-02 12:07:40.236116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.236133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.406 #23 NEW cov: 11788 ft: 14752 corp: 22/44b lim: 5 exec/s: 23 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:07:53.406 [2024-11-02 12:07:40.276294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.276323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.406 [2024-11-02 12:07:40.276456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.276474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.406 [2024-11-02 12:07:40.276603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.276619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.406 #24 NEW cov: 11788 ft: 14763 corp: 23/47b lim: 5 exec/s: 24 rss: 68Mb L: 3/5 MS: 1 ChangeByte- 00:07:53.406 [2024-11-02 12:07:40.325923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.325951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.406 #25 NEW cov: 11788 ft: 14781 corp: 24/48b lim: 5 exec/s: 25 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:07:53.406 [2024-11-02 12:07:40.366366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.366394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.406 [2024-11-02 12:07:40.366511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.406 [2024-11-02 12:07:40.366528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.665 #26 NEW cov: 11788 ft: 14797 corp: 25/50b lim: 5 exec/s: 26 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:07:53.665 [2024-11-02 12:07:40.417248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.417278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.665 [2024-11-02 12:07:40.417394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.417411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.665 [2024-11-02 12:07:40.417524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.417541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.665 [2024-11-02 12:07:40.417661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.417678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 
m:0 dnr:0 00:07:53.665 [2024-11-02 12:07:40.417799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.417817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.665 #27 NEW cov: 11788 ft: 14810 corp: 26/55b lim: 5 exec/s: 27 rss: 68Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:53.665 [2024-11-02 12:07:40.456863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.456890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.665 [2024-11-02 12:07:40.457010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.457028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.665 [2024-11-02 12:07:40.457143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.457161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.665 #28 NEW cov: 11788 ft: 14826 corp: 27/58b lim: 5 exec/s: 28 rss: 68Mb L: 3/5 MS: 1 ChangeBit- 00:07:53.665 [2024-11-02 12:07:40.507011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.507040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.665 [2024-11-02 12:07:40.507165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.507182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.665 [2024-11-02 12:07:40.507305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.507323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.665 #29 NEW cov: 11788 ft: 14835 corp: 28/61b lim: 5 exec/s: 29 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:07:53.665 [2024-11-02 12:07:40.546871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.665 [2024-11-02 12:07:40.546906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.666 [2024-11-02 12:07:40.547029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:53.666 [2024-11-02 12:07:40.547047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.666 #30 NEW cov: 11788 ft: 14847 corp: 29/63b lim: 5 exec/s: 30 rss: 68Mb L: 2/5 MS: 1 ChangeByte- 00:07:53.666 [2024-11-02 12:07:40.596952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.666 [2024-11-02 12:07:40.596979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.666 [2024-11-02 12:07:40.597094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.666 [2024-11-02 12:07:40.597111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.666 #31 NEW cov: 11788 ft: 14864 corp: 30/65b lim: 5 exec/s: 31 rss: 68Mb L: 2/5 MS: 1 EraseBytes- 00:07:53.666 [2024-11-02 12:07:40.637457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.666 [2024-11-02 12:07:40.637484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.666 [2024-11-02 12:07:40.637605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.666 [2024-11-02 12:07:40.637622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.666 [2024-11-02 12:07:40.637747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.666 [2024-11-02 12:07:40.637763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.925 #32 NEW cov: 11788 ft: 14876 corp: 31/68b lim: 5 exec/s: 32 rss: 68Mb L: 3/5 MS: 1 CMP- DE: "\010\000"- 00:07:53.925 [2024-11-02 12:07:40.677274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.677301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.925 [2024-11-02 12:07:40.677423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.677441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.925 #33 NEW cov: 11788 ft: 14884 corp: 32/70b lim: 5 exec/s: 33 rss: 68Mb L: 2/5 MS: 1 CrossOver- 00:07:53.925 [2024-11-02 12:07:40.717637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.717664] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.925 [2024-11-02 12:07:40.717792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.717809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.925 [2024-11-02 12:07:40.717940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.717956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.925 #34 NEW cov: 11788 ft: 14910 corp: 33/73b lim: 5 exec/s: 34 rss: 68Mb L: 3/5 MS: 1 ChangeBit- 00:07:53.925 [2024-11-02 12:07:40.757705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.757732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.925 [2024-11-02 12:07:40.757865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.757883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.925 [2024-11-02 12:07:40.758002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.758018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.925 #35 NEW cov: 11788 ft: 14931 corp: 34/76b lim: 5 exec/s: 35 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:07:53.925 [2024-11-02 12:07:40.807938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.807966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.925 [2024-11-02 12:07:40.808105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.808123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.925 [2024-11-02 12:07:40.808248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.808267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.925 #36 NEW cov: 11788 ft: 14941 corp: 35/79b lim: 5 exec/s: 36 rss: 68Mb L: 3/5 MS: 1 CrossOver- 00:07:53.925 [2024-11-02 12:07:40.847706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.847733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.925 [2024-11-02 12:07:40.847847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.847862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.925 #37 NEW cov: 11788 ft: 14949 corp: 36/81b lim: 5 exec/s: 37 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:07:53.925 [2024-11-02 12:07:40.888342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.888369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.925 [2024-11-02 12:07:40.888498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.888517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.925 [2024-11-02 12:07:40.888644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.888662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.925 [2024-11-02 12:07:40.888780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.925 [2024-11-02 12:07:40.888798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.184 #38 NEW cov: 11788 ft: 14960 corp: 37/85b lim: 5 exec/s: 38 rss: 69Mb L: 4/5 MS: 1 PersAutoDict- DE: "\010\000"- 00:07:54.184 [2024-11-02 12:07:40.938000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.184 [2024-11-02 12:07:40.938026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.184 [2024-11-02 12:07:40.938162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.184 [2024-11-02 12:07:40.938180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.184 #39 NEW cov: 11788 ft: 14995 corp: 38/87b lim: 5 exec/s: 39 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:07:54.184 [2024-11-02 12:07:40.978114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.184 [2024-11-02 12:07:40.978141] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.184 [2024-11-02 12:07:40.978272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.184 [2024-11-02 12:07:40.978289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.184 #40 NEW cov: 11788 ft: 14999 corp: 39/89b lim: 5 exec/s: 40 rss: 69Mb L: 2/5 MS: 1 CopyPart- 00:07:54.184 [2024-11-02 12:07:41.018951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.184 [2024-11-02 12:07:41.018978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.184 [2024-11-02 12:07:41.019074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.184 [2024-11-02 12:07:41.019091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.184 [2024-11-02 12:07:41.019219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.184 [2024-11-02 12:07:41.019234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.184 [2024-11-02 12:07:41.019350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.184 [2024-11-02 12:07:41.019367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.184 [2024-11-02 12:07:41.019485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.184 [2024-11-02 12:07:41.019500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.184 #41 NEW cov: 11788 ft: 15000 corp: 40/94b lim: 5 exec/s: 41 rss: 69Mb L: 5/5 MS: 1 CopyPart- 00:07:54.184 [2024-11-02 12:07:41.058722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.185 [2024-11-02 12:07:41.058749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.185 [2024-11-02 12:07:41.058878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.185 [2024-11-02 12:07:41.058894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.185 [2024-11-02 12:07:41.059024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000
00:07:54.185 [2024-11-02 12:07:41.059041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:54.185 #42 NEW cov: 11788 ft: 15004 corp: 41/97b lim: 5 exec/s: 21 rss: 70Mb L: 3/5 MS: 1 ChangeBinInt-
00:07:54.185 #42 DONE cov: 11788 ft: 15004 corp: 41/97b lim: 5 exec/s: 21 rss: 70Mb
00:07:54.185 ###### Recommended dictionary. ######
00:07:54.185 "\010\000" # Uses: 1
00:07:54.185 ###### End of recommended dictionary. ######
00:07:54.185 Done 42 runs in 2 second(s)
00:07:54.444 12:07:41 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf
00:07:54.444 12:07:41 -- ../common.sh@72 -- # (( i++ ))
00:07:54.444 12:07:41 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:54.444 12:07:41 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1
00:07:54.444 12:07:41 -- nvmf/run.sh@23 -- # local fuzzer_type=10
00:07:54.444 12:07:41 -- nvmf/run.sh@24 -- # local timen=1
00:07:54.444 12:07:41 -- nvmf/run.sh@25 -- # local core=0x1
00:07:54.444 12:07:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:54.444 12:07:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf
00:07:54.444 12:07:41 -- nvmf/run.sh@29 -- # printf %02d 10
00:07:54.444 12:07:41 -- nvmf/run.sh@29 -- # port=4410
00:07:54.444 12:07:41 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:54.444 12:07:41 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410'
00:07:54.444 12:07:41 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:54.444 12:07:41 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock
00:07:54.703 [2024-11-02 12:07:41.241923] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... [2024-11-02 12:07:41.241990] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1148351 ]
00:07:54.703 EAL: No free 2048 kB hugepages reported on node 1
00:07:54.703 [2024-11-02 12:07:41.497253] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:54.703 [2024-11-02 12:07:41.525169] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:54.703 [2024-11-02 12:07:41.525301] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:54.703 [2024-11-02 12:07:41.576640] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:54.703 [2024-11-02 12:07:41.593023] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 ***
00:07:54.703 INFO: Running with entropic power schedule (0xFF, 100).
00:07:54.703 INFO: Seed: 1135832377
00:07:54.703 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:07:54.703 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:07:54.703 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:54.703 INFO: A corpus is not provided, starting from an empty corpus
00:07:54.703 #2 INITED exec/s: 0 rss: 59Mb
00:07:54.703 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:54.703 This may also happen if the target rejected all inputs we tried so far
00:07:54.703 [2024-11-02 12:07:41.641418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdc0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:54.703 [2024-11-02 12:07:41.641445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:54.961 NEW_FUNC[1/670]: 0x45e248 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205
00:07:54.961 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:54.961 #10 NEW cov: 11584 ft: 11584 corp: 2/10b lim: 40 exec/s: 0 rss: 66Mb L: 9/9 MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes-
00:07:54.961 [2024-11-02 12:07:41.932184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:54.961 [2024-11-02 12:07:41.932224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:55.219 #11 NEW cov: 11697 ft: 11984 corp: 3/21b lim: 40 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 CopyPart-
00:07:55.219 [2024-11-02 12:07:41.982237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc40dcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:55.219 [2024-11-02 12:07:41.982264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:55.219 #12 NEW cov: 11703 ft: 12300 corp: 4/33b lim: 40 exec/s: 0 rss: 68Mb L: 12/12 MS: 1 InsertByte-
00:07:55.219 [2024-11-02 12:07:42.022339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:55.219 [2024-11-02 12:07:42.022365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:55.220 #13 NEW cov: 11765 ft: 12653 corp: 5/42b lim: 40 exec/s: 0 rss: 68Mb L: 9/12 MS: 1 CrossOver-
00:07:55.220 [2024-11-02 12:07:42.062578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc40dcdc cdw11:dc40dcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:55.220 [2024-11-02 12:07:42.062603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:55.220 [2024-11-02 12:07:42.062661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdc0a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:55.220 [2024-11-02
12:07:42.062674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.220 #14 NEW cov: 11788 ft: 13012 corp: 6/65b lim: 40 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 CopyPart- 00:07:55.220 [2024-11-02 12:07:42.102942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-02 12:07:42.102967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.220 [2024-11-02 12:07:42.103028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-02 12:07:42.103045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.220 [2024-11-02 12:07:42.103102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-02 12:07:42.103115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.220 [2024-11-02 12:07:42.103172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-02 12:07:42.103185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.220 #15 NEW cov: 11788 ft: 13601 corp: 7/103b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:07:55.220 [2024-11-02 12:07:42.142669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdc00 cdw11:000000dc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-02 12:07:42.142695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.220 #21 NEW cov: 11788 ft: 13693 corp: 8/116b lim: 40 exec/s: 0 rss: 68Mb L: 13/38 MS: 1 InsertRepeatedBytes- 00:07:55.220 [2024-11-02 12:07:42.183132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-02 12:07:42.183158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.220 [2024-11-02 12:07:42.183221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-02 12:07:42.183234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.220 [2024-11-02 12:07:42.183310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaa0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-02 12:07:42.183324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.220 [2024-11-02 12:07:42.183386] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.220 [2024-11-02 12:07:42.183399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.478 #22 NEW cov: 11788 ft: 13720 corp: 9/154b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 ChangeBinInt- 00:07:55.478 [2024-11-02 12:07:42.223304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.478 [2024-11-02 12:07:42.223330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.478 [2024-11-02 12:07:42.223389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.478 [2024-11-02 12:07:42.223402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.478 [2024-11-02 12:07:42.223461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.478 [2024-11-02 12:07:42.223474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.478 [2024-11-02 12:07:42.223538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.478 [2024-11-02 12:07:42.223551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.478 #23 NEW cov: 11788 ft: 13745 corp: 10/192b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 CopyPart- 00:07:55.478 [2024-11-02 12:07:42.263381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.263406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.479 [2024-11-02 12:07:42.263466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.263479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.479 [2024-11-02 12:07:42.263535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaa0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.263549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.479 [2024-11-02 12:07:42.263605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.263618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:07:55.479 #24 NEW cov: 11788 ft: 13862 corp: 11/230b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 ChangeByte- 00:07:55.479 [2024-11-02 12:07:42.303551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.303576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.479 [2024-11-02 12:07:42.303638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.303652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.479 [2024-11-02 12:07:42.303711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.303724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.479 [2024-11-02 12:07:42.303783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:0faaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.303796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.479 #25 NEW cov: 11788 ft: 13937 corp: 12/268b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 CMP- DE: "\377\017"- 00:07:55.479 [2024-11-02 12:07:42.343236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:017f39ed cdw11:e7ce0de2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.343261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.479 #29 NEW cov: 11788 ft: 14019 corp: 13/277b lim: 40 exec/s: 0 rss: 68Mb L: 9/38 MS: 4 CrossOver-ShuffleBytes-ChangeBit-CMP- DE: "\001\1779\355\347\316\015\342"- 00:07:55.479 [2024-11-02 12:07:42.383458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc40dcdc cdw11:dc40dcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.383487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.479 [2024-11-02 12:07:42.383548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e1dcdcdc cdw11:dcdc0a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.383561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.479 #30 NEW cov: 11788 ft: 14039 corp: 14/300b lim: 40 exec/s: 0 rss: 68Mb L: 23/38 MS: 1 ChangeBinInt- 00:07:55.479 [2024-11-02 12:07:42.423463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc017f39 cdw11:ede7ce0d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.479 [2024-11-02 12:07:42.423488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.479 #31 NEW cov: 
11788 ft: 14070 corp: 15/309b lim: 40 exec/s: 0 rss: 68Mb L: 9/38 MS: 1 PersAutoDict- DE: "\001\1779\355\347\316\015\342"- 00:07:55.737 [2024-11-02 12:07:42.463962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.737 [2024-11-02 12:07:42.463988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.737 [2024-11-02 12:07:42.464051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.737 [2024-11-02 12:07:42.464065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.737 [2024-11-02 12:07:42.464137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.737 [2024-11-02 12:07:42.464151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.737 [2024-11-02 12:07:42.464205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:0faa8caa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.737 [2024-11-02 12:07:42.464218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.737 #32 NEW cov: 11788 ft: 14140 corp: 16/348b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 InsertByte- 00:07:55.737 [2024-11-02 12:07:42.503698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdc1c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.737 [2024-11-02 12:07:42.503725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.737 #33 NEW cov: 11788 ft: 14154 corp: 17/359b lim: 40 exec/s: 0 rss: 69Mb L: 11/39 MS: 1 ChangeBinInt- 00:07:55.737 [2024-11-02 12:07:42.543790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc000bdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.737 [2024-11-02 12:07:42.543815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.738 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:55.738 #34 NEW cov: 11811 ft: 14226 corp: 18/370b lim: 40 exec/s: 0 rss: 69Mb L: 11/39 MS: 1 ChangeBinInt- 00:07:55.738 [2024-11-02 12:07:42.583892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdddc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.738 [2024-11-02 12:07:42.583917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.738 #35 NEW cov: 11811 ft: 14259 corp: 19/379b lim: 40 exec/s: 0 rss: 69Mb L: 9/39 MS: 1 ChangeBit- 00:07:55.738 [2024-11-02 12:07:42.614005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc40dcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.738 
[2024-11-02 12:07:42.614031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.738 #36 NEW cov: 11811 ft: 14273 corp: 20/391b lim: 40 exec/s: 36 rss: 69Mb L: 12/39 MS: 1 ShuffleBytes- 00:07:55.738 [2024-11-02 12:07:42.654488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdc02 cdw11:000000dc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.738 [2024-11-02 12:07:42.654513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.738 [2024-11-02 12:07:42.654576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.738 [2024-11-02 12:07:42.654589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.738 [2024-11-02 12:07:42.654650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.738 [2024-11-02 12:07:42.654663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.738 [2024-11-02 12:07:42.654725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:0faa8caa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.738 [2024-11-02 12:07:42.654739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.738 #37 NEW cov: 11811 ft: 14290 corp: 21/430b lim: 40 exec/s: 37 rss: 69Mb L: 39/39 MS: 1 CMP- DE: "\002\000\000\000"- 00:07:55.738 [2024-11-02 12:07:42.694655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.738 [2024-11-02 12:07:42.694679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.738 [2024-11-02 12:07:42.694742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:eaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.738 [2024-11-02 12:07:42.694755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.738 [2024-11-02 12:07:42.694817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.738 [2024-11-02 12:07:42.694831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.738 [2024-11-02 12:07:42.694894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:0faa8caa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.738 [2024-11-02 12:07:42.694907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.996 #38 NEW cov: 11811 ft: 14305 corp: 22/469b lim: 40 exec/s: 38 rss: 69Mb L: 39/39 MS: 1 ChangeBit- 00:07:55.996 [2024-11-02 12:07:42.734842] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00dcdcdc cdw11:02000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.734867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.996 [2024-11-02 12:07:42.734909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:dcaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.734926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.996 [2024-11-02 12:07:42.734987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.735021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.996 [2024-11-02 12:07:42.735082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ff0faa8c cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.735096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.996 [2024-11-02 12:07:42.735158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaadc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.735171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.996 #39 NEW cov: 11811 ft: 14361 corp: 23/509b lim: 40 exec/s: 39 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:07:55.996 [2024-11-02 12:07:42.774855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.774880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.996 [2024-11-02 12:07:42.774939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.774953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.996 [2024-11-02 12:07:42.775013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.775025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.996 [2024-11-02 12:07:42.775085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:0daaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.775098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.996 #40 NEW cov: 11811 ft: 14377 corp: 24/547b lim: 40 exec/s: 40 rss: 69Mb L: 38/40 MS: 1 
ChangeBit- 00:07:55.996 [2024-11-02 12:07:42.814579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.814603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.996 #41 NEW cov: 11811 ft: 14434 corp: 25/560b lim: 40 exec/s: 41 rss: 69Mb L: 13/40 MS: 1 CrossOver- 00:07:55.996 [2024-11-02 12:07:42.844778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc40dcdc cdw11:dc40dcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.844803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.996 [2024-11-02 12:07:42.844862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdc0a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.996 [2024-11-02 12:07:42.844875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.996 #42 NEW cov: 11811 ft: 14439 corp: 26/583b lim: 40 exec/s: 42 rss: 69Mb L: 23/40 MS: 1 ShuffleBytes- 00:07:55.997 [2024-11-02 12:07:42.884785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:27dcdcdc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.997 [2024-11-02 12:07:42.884810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.997 #43 NEW cov: 11811 ft: 14458 corp: 27/597b lim: 40 exec/s: 43 rss: 69Mb L: 14/40 MS: 1 InsertByte- 00:07:55.997 [2024-11-02 12:07:42.924944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc000bdc cdw11:dcdcdcd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.997 [2024-11-02 12:07:42.924968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.997 #44 NEW cov: 11811 ft: 14482 corp: 28/608b lim: 40 exec/s: 44 rss: 69Mb L: 11/40 MS: 1 ChangeBit- 00:07:55.997 [2024-11-02 12:07:42.965446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.997 [2024-11-02 12:07:42.965470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.997 [2024-11-02 12:07:42.965531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.997 [2024-11-02 12:07:42.965545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.997 [2024-11-02 12:07:42.965604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.997 [2024-11-02 12:07:42.965618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.997 [2024-11-02 12:07:42.965675] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:aa5eaaaa cdw11:dcaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.997 [2024-11-02 12:07:42.965688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.255 #45 NEW cov: 11811 ft: 14516 corp: 29/646b lim: 40 exec/s: 45 rss: 69Mb L: 38/40 MS: 1 CrossOver- 00:07:56.255 [2024-11-02 12:07:43.005725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc40dcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.005749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.255 [2024-11-02 12:07:43.005824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:dcdca7a7 cdw11:a7a7a7a7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.005838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.255 [2024-11-02 12:07:43.005896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:a7a7a7a7 cdw11:a7a7a7a7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.005910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.255 [2024-11-02 12:07:43.005967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:a7a7a7a7 cdw11:a7a7a7a7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.005980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.255 [2024-11-02 12:07:43.006046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:a7a7a7a7 cdw11:a7a70a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.006059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.255 #46 NEW cov: 11811 ft: 14563 corp: 30/686b lim: 40 exec/s: 46 rss: 69Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:56.255 [2024-11-02 12:07:43.045395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.045419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.255 [2024-11-02 12:07:43.045494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.045509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.255 #47 NEW cov: 11811 ft: 14582 corp: 31/709b lim: 40 exec/s: 47 rss: 69Mb L: 23/40 MS: 1 EraseBytes- 00:07:56.255 [2024-11-02 12:07:43.085414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:27dcdcdc cdw11:0000017f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.085440] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.255 #48 NEW cov: 11811 ft: 14595 corp: 32/723b lim: 40 exec/s: 48 rss: 69Mb L: 14/40 MS: 1 PersAutoDict- DE: "\001\1779\355\347\316\015\342"- 00:07:56.255 [2024-11-02 12:07:43.125572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdc00 cdw11:000000dc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.125597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.255 #49 NEW cov: 11811 ft: 14606 corp: 33/737b lim: 40 exec/s: 49 rss: 69Mb L: 14/40 MS: 1 CrossOver- 00:07:56.255 [2024-11-02 12:07:43.165666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:9cdcdcdc cdw11:dcdcdc0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.165691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.255 #50 NEW cov: 11811 ft: 14608 corp: 34/746b lim: 40 exec/s: 50 rss: 69Mb L: 9/40 MS: 1 ChangeBit- 00:07:56.255 [2024-11-02 12:07:43.195998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc000bff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.196022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.255 [2024-11-02 12:07:43.196084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.196098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.255 [2024-11-02 12:07:43.196159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.255 [2024-11-02 12:07:43.196172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.255 #51 NEW cov: 11811 ft: 14801 corp: 35/775b lim: 40 exec/s: 51 rss: 69Mb L: 29/40 MS: 1 InsertRepeatedBytes- 00:07:56.514 [2024-11-02 12:07:43.235885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc017f39 cdw11:ffede7ce SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.235909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 #52 NEW cov: 11811 ft: 14807 corp: 36/785b lim: 40 exec/s: 52 rss: 69Mb L: 10/40 MS: 1 InsertByte- 00:07:56.514 [2024-11-02 12:07:43.275986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc01dc7f cdw11:39ede7ce SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.276017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 #53 NEW cov: 11811 ft: 14905 corp: 37/795b lim: 40 exec/s: 53 rss: 70Mb L: 10/40 MS: 1 CopyPart- 00:07:56.514 [2024-11-02 12:07:43.316078] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcccdc0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.316103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 #54 NEW cov: 11811 ft: 14948 corp: 38/804b lim: 40 exec/s: 54 rss: 70Mb L: 9/40 MS: 1 ChangeBit- 00:07:56.514 [2024-11-02 12:07:43.346533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.346557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 [2024-11-02 12:07:43.346616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:eaaaaaff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.346629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.514 [2024-11-02 12:07:43.346683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0faa8caa cdw11:aaaaaaff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.346696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.514 [2024-11-02 12:07:43.346749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:0faa8caa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.346761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.514 #55 NEW cov: 11811 ft: 14963 corp: 39/843b lim: 40 exec/s: 55 rss: 70Mb L: 39/40 MS: 1 CopyPart- 00:07:56.514 [2024-11-02 12:07:43.386308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc000b25 cdw11:23dcdcd8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.386334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 #56 NEW cov: 11811 ft: 14974 corp: 40/854b lim: 40 exec/s: 56 rss: 70Mb L: 11/40 MS: 1 ChangeBinInt- 00:07:56.514 [2024-11-02 12:07:43.426763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc40dcdc cdw11:dc40dcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.426789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 [2024-11-02 12:07:43.426853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdc0a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.426867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.514 [2024-11-02 12:07:43.426927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdc0a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.426941] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.514 #57 NEW cov: 11811 ft: 14994 corp: 41/878b lim: 40 exec/s: 57 rss: 70Mb L: 24/40 MS: 1 CopyPart- 00:07:56.514 [2024-11-02 12:07:43.466558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc01ce0d cdw11:e77fdc39 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.514 [2024-11-02 12:07:43.466583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.773 #58 NEW cov: 11811 ft: 14999 corp: 42/888b lim: 40 exec/s: 58 rss: 70Mb L: 10/40 MS: 1 ShuffleBytes- 00:07:56.773 [2024-11-02 12:07:43.507194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc40dcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.773 [2024-11-02 12:07:43.507221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.773 [2024-11-02 12:07:43.507292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:dcdca7a7 cdw11:a7a7a7a7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.773 [2024-11-02 12:07:43.507306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.773 [2024-11-02 12:07:43.507363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:a7a7a7a7 cdw11:a7a7a7a7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.773 [2024-11-02 12:07:43.507377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.773 [2024-11-02 12:07:43.507433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:a7a7a7a7 cdw11:a7aaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.773 [2024-11-02 12:07:43.507447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.773 [2024-11-02 12:07:43.507504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:aaaaaaaa cdw11:aaaa0a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.773 [2024-11-02 12:07:43.507517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.773 #59 NEW cov: 11811 ft: 15050 corp: 43/928b lim: 40 exec/s: 59 rss: 70Mb L: 40/40 MS: 1 CrossOver- 00:07:56.773 [2024-11-02 12:07:43.547231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.773 [2024-11-02 12:07:43.547257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.773 [2024-11-02 12:07:43.547334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.773 [2024-11-02 12:07:43.547347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.773 [2024-11-02 12:07:43.547412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30aaaaaa cdw11:aaaaaaff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.773 [2024-11-02 12:07:43.547426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.773 [2024-11-02 12:07:43.547487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:0daaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.773 [2024-11-02 12:07:43.547500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.773 #60 NEW cov: 11811 ft: 15077 corp: 44/966b lim: 40 exec/s: 60 rss: 70Mb L: 38/40 MS: 1 ChangeByte- 00:07:56.773 [2024-11-02 12:07:43.586873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.773 [2024-11-02 12:07:43.586897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.773 #61 NEW cov: 11811 ft: 15094 corp: 45/975b lim: 40 exec/s: 61 rss: 70Mb L: 9/40 MS: 1 CopyPart- 00:07:56.773 [2024-11-02 12:07:43.616960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:dc010de7 cdw11:7fce0de7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.773 [2024-11-02 12:07:43.616989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.773 #63 NEW cov: 11811 ft: 15105 corp: 46/985b lim: 40 exec/s: 31 rss: 70Mb L: 10/40 MS: 2 EraseBytes-CopyPart- 00:07:56.773 #63 DONE cov: 11811 ft: 15105 corp: 46/985b lim: 40 exec/s: 31 rss: 70Mb 00:07:56.773 ###### Recommended dictionary. ###### 00:07:56.773 "\377\017" # Uses: 0 00:07:56.773 "\001\1779\355\347\316\015\342" # Uses: 2 00:07:56.773 "\002\000\000\000" # Uses: 0 00:07:56.773 ###### End of recommended dictionary. 
######
00:07:56.773 Done 63 runs in 2 second(s)
00:07:57.033 12:07:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf
12:07:43 -- ../common.sh@72 -- # (( i++ ))
12:07:43 -- ../common.sh@72 -- # (( i < fuzz_num ))
12:07:43 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1
12:07:43 -- nvmf/run.sh@23 -- # local fuzzer_type=11
12:07:43 -- nvmf/run.sh@24 -- # local timen=1
12:07:43 -- nvmf/run.sh@25 -- # local core=0x1
12:07:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
12:07:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf
12:07:43 -- nvmf/run.sh@29 -- # printf %02d 11
12:07:43 -- nvmf/run.sh@29 -- # port=4411
12:07:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
12:07:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411'
12:07:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
12:07:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock
[2024-11-02 12:07:43.800233] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization...
[2024-11-02 12:07:43.800327] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1148831 ]
EAL: No free 2048 kB hugepages reported on node 1
[2024-11-02 12:07:44.056757] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-11-02 12:07:44.086329] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
[2024-11-02 12:07:44.086449] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[2024-11-02 12:07:44.137741] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
[2024-11-02 12:07:44.154158] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 ***
INFO: Running with entropic power schedule (0xFF, 100).
INFO: Seed: 3699834854
INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
INFO: A corpus is not provided, starting from an empty corpus
#2 INITED exec/s: 0 rss: 59Mb
WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:57.292 This may also happen if the target rejected all inputs we tried so far 00:07:57.292 [2024-11-02 12:07:44.209727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.292 [2024-11-02 12:07:44.209755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.292 [2024-11-02 12:07:44.209815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.292 [2024-11-02 12:07:44.209829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.292 [2024-11-02 12:07:44.209880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.292 [2024-11-02 12:07:44.209893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.292 [2024-11-02 12:07:44.209947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.292 [2024-11-02 12:07:44.209959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.635 NEW_FUNC[1/671]: 0x45ffb8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:57.635 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:57.635 #12 NEW cov: 11596 ft: 11597 corp: 2/36b lim: 40 exec/s: 0 rss: 67Mb L: 35/35 MS: 5 ChangeBit-ChangeByte-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:07:57.635 [2024-11-02 12:07:44.510490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.635 [2024-11-02 12:07:44.510521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.635 [2024-11-02 12:07:44.510594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.635 [2024-11-02 12:07:44.510608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.635 [2024-11-02 12:07:44.510662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.635 [2024-11-02 12:07:44.510676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.635 [2024-11-02 12:07:44.510729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.635 [2024-11-02 12:07:44.510742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
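[Editor's note] The NEW_FUNC lines above name this run's harness entry points: libFuzzer hands each generated input to TestOneInput (llvm_nvme_fuzz.c:780), which dispatches it to a per-opcode handler such as fuzz_admin_security_send_command (llvm_nvme_fuzz.c:223), and the SECURITY SEND (81) command blocks are those fuzzed admin commands being echoed back. Below is a minimal sketch of that submission path using SPDK's public raw admin-command API; the names fuzz_security_send and fuzz_cmd_done and the byte-to-dword mapping are illustrative assumptions, not the harness's verbatim source.

    #include "spdk/nvme.h"
    #include <string.h>

    /* Completion callback; on an error status the driver itself prints the
     * command/completion pairs seen throughout this log. */
    static void
    fuzz_cmd_done(void *ctx, const struct spdk_nvme_cpl *cpl)
    {
            (void)ctx;
            (void)cpl;
    }

    /* Hypothetical per-opcode fuzzer: pack fuzz bytes into cdw10/cdw11 of
     * a raw SECURITY SEND (opcode 0x81) admin command and queue it. */
    static int
    fuzz_security_send(struct spdk_nvme_ctrlr *ctrlr, const uint8_t *data, size_t size)
    {
            struct spdk_nvme_cmd cmd = {0};

            cmd.opc = SPDK_NVME_OPC_SECURITY_SEND;
            if (size >= 8) {
                    memcpy(&cmd.cdw10, data, sizeof(cmd.cdw10));
                    memcpy(&cmd.cdw11, data + 4, sizeof(cmd.cdw11));
            }
            /* No payload buffer: the target rejects the opcode before any
             * data transfer takes place. */
            return spdk_nvme_ctrlr_cmd_admin_raw(ctrlr, &cmd, NULL, 0,
                                                 fuzz_cmd_done, NULL);
    }

Completions are then reaped by polling spdk_nvme_ctrlr_process_admin_completions(), at which point SPDK prints each rejected command together with its error completion, producing the paired 225:nvme_admin_qpair_print_command / 477:spdk_nvme_print_completion notices.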
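[Editor's note] For reading the status lines that follow (for example the '#18 NEW cov: 11709 ft: 12089 corp: 3/71b lim: 40 ... L: 35/35 MS: 1 ShuffleBytes-' line just below): cov counts coverage points hit so far, ft counts features, corp gives corpus entries and total bytes, lim is the current input-length cap, exec/s and rss are throughput and memory, L is (roughly) the new input's length against the largest in the corpus, and MS names the mutation sequence that produced it. The '(00/01)' in every completion notice is the NVMe status as (status code type/status code): generic command status, Invalid Opcode, the expected rejection for an admin opcode the target does not implement. A short decoding sketch with public SPDK types (explain_completion is a hypothetical helper, not harness code):

    #include "spdk/nvme.h"
    #include <stdio.h>

    /* Decode the "(00/01)" pair printed with each error completion:
     * SCT 0x0 (generic command status) / SC 0x1 (Invalid Opcode). */
    static void
    explain_completion(const struct spdk_nvme_cpl *cpl)
    {
            if (cpl->status.sct == SPDK_NVME_SCT_GENERIC &&
                cpl->status.sc == SPDK_NVME_SC_INVALID_OPCODE) {
                    printf("INVALID OPCODE (%02x/%02x) sqhd:%04x\n",
                           (unsigned)cpl->status.sct, (unsigned)cpl->status.sc,
                           (unsigned)cpl->sqhd);
            }
    }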
00:07:57.635 #18 NEW cov: 11709 ft: 12089 corp: 3/71b lim: 40 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:57.635 [2024-11-02 12:07:44.560111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.635 [2024-11-02 12:07:44.560138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.940 #22 NEW cov: 11715 ft: 13113 corp: 4/85b lim: 40 exec/s: 0 rss: 67Mb L: 14/35 MS: 4 ShuffleBytes-CopyPart-ChangeBinInt-CrossOver- 00:07:57.940 [2024-11-02 12:07:44.600679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:58000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.600704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.600762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.600775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.600840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.600853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.600908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.600922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.940 #23 NEW cov: 11800 ft: 13340 corp: 5/120b lim: 40 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:57.940 [2024-11-02 12:07:44.650825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.650850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.650923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:87878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.650937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.650997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:87878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.651021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.651096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:87878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 
[2024-11-02 12:07:44.651109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.940 #24 NEW cov: 11800 ft: 13398 corp: 6/157b lim: 40 exec/s: 0 rss: 67Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:07:57.940 [2024-11-02 12:07:44.690432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.690457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.940 #25 NEW cov: 11800 ft: 13510 corp: 7/167b lim: 40 exec/s: 0 rss: 67Mb L: 10/37 MS: 1 EraseBytes- 00:07:57.940 [2024-11-02 12:07:44.731088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:58000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.731113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.731187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.731202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.731257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.731270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.731325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:23000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.731338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.940 #26 NEW cov: 11800 ft: 13575 corp: 8/202b lim: 40 exec/s: 0 rss: 67Mb L: 35/37 MS: 1 ChangeBinInt- 00:07:57.940 [2024-11-02 12:07:44.771183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.771208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.771281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.771295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.771353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.771366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.940 [2024-11-02 12:07:44.771424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.940 [2024-11-02 12:07:44.771437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.940 #27 NEW cov: 11800 ft: 13610 corp: 9/236b lim: 40 exec/s: 0 rss: 67Mb L: 34/37 MS: 1 EraseBytes- 00:07:57.941 [2024-11-02 12:07:44.811107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:0000d9d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.941 [2024-11-02 12:07:44.811131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.941 [2024-11-02 12:07:44.811205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:d9d9d9d9 cdw11:d9d9d9d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.941 [2024-11-02 12:07:44.811219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.941 [2024-11-02 12:07:44.811278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:d9d9d900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.941 [2024-11-02 12:07:44.811291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.941 #28 NEW cov: 11800 ft: 13865 corp: 10/263b lim: 40 exec/s: 0 rss: 67Mb L: 27/37 MS: 1 InsertRepeatedBytes- 00:07:57.941 [2024-11-02 12:07:44.851441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000040 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.941 [2024-11-02 12:07:44.851467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.941 [2024-11-02 12:07:44.851541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.941 [2024-11-02 12:07:44.851555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.941 [2024-11-02 12:07:44.851612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.941 [2024-11-02 12:07:44.851625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.941 [2024-11-02 12:07:44.851678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.941 [2024-11-02 12:07:44.851691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.941 #29 NEW cov: 11800 ft: 13886 corp: 11/298b lim: 40 exec/s: 0 rss: 67Mb L: 35/37 MS: 1 ChangeBit- 00:07:57.941 [2024-11-02 12:07:44.891093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.941 [2024-11-02 12:07:44.891119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.941 #30 NEW cov: 11800 ft: 13988 corp: 12/312b lim: 40 exec/s: 0 rss: 67Mb L: 14/37 MS: 1 ChangeBinInt- 00:07:58.199 [2024-11-02 12:07:44.931319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:44.931345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.199 [2024-11-02 12:07:44.931401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:44.931415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.199 #31 NEW cov: 11800 ft: 14233 corp: 13/334b lim: 40 exec/s: 0 rss: 67Mb L: 22/37 MS: 1 CrossOver- 00:07:58.199 [2024-11-02 12:07:44.971719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000040 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:44.971745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.199 [2024-11-02 12:07:44.971821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:44.971836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.199 [2024-11-02 12:07:44.971893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:44.971906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.199 [2024-11-02 12:07:44.971959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:44.971973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.199 #32 NEW cov: 11800 ft: 14266 corp: 14/369b lim: 40 exec/s: 0 rss: 68Mb L: 35/37 MS: 1 ShuffleBytes- 00:07:58.199 [2024-11-02 12:07:45.011579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:45.011604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.199 [2024-11-02 12:07:45.011663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:45.011677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.199 #33 NEW cov: 11800 ft: 14302 corp: 15/391b lim: 40 exec/s: 0 rss: 68Mb L: 22/37 MS: 1 ShuffleBytes- 00:07:58.199 [2024-11-02 12:07:45.052113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:45.052139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.199 [2024-11-02 12:07:45.052196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:45.052212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.199 [2024-11-02 12:07:45.052283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:45.052297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.199 [2024-11-02 12:07:45.052352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:45.052365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.199 [2024-11-02 12:07:45.052420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.199 [2024-11-02 12:07:45.052434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.200 #34 NEW cov: 11800 ft: 14445 corp: 16/431b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:07:58.200 [2024-11-02 12:07:45.092083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.200 [2024-11-02 12:07:45.092109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.200 [2024-11-02 12:07:45.092166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00850000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.200 [2024-11-02 12:07:45.092179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.200 [2024-11-02 12:07:45.092237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.200 [2024-11-02 12:07:45.092250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.200 [2024-11-02 12:07:45.092307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.200 [2024-11-02 12:07:45.092321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.200 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.200 #35 NEW cov: 11823 ft: 14501 corp: 17/466b lim: 40 
exec/s: 0 rss: 68Mb L: 35/40 MS: 1 ChangeByte- 00:07:58.200 [2024-11-02 12:07:45.132193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000040 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.200 [2024-11-02 12:07:45.132219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.200 [2024-11-02 12:07:45.132278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.200 [2024-11-02 12:07:45.132291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.200 [2024-11-02 12:07:45.132347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.200 [2024-11-02 12:07:45.132360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.200 [2024-11-02 12:07:45.132413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.200 [2024-11-02 12:07:45.132429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.200 #39 NEW cov: 11823 ft: 14514 corp: 18/502b lim: 40 exec/s: 0 rss: 68Mb L: 36/40 MS: 4 InsertByte-EraseBytes-ChangeByte-CrossOver- 00:07:58.200 [2024-11-02 12:07:45.161802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.200 [2024-11-02 12:07:45.161827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.458 #40 NEW cov: 11823 ft: 14568 corp: 19/516b lim: 40 exec/s: 0 rss: 68Mb L: 14/40 MS: 1 CrossOver- 00:07:58.458 [2024-11-02 12:07:45.202549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:08000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.458 [2024-11-02 12:07:45.202574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.202634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.202647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.202705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.202718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.202776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.202790] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.202849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.202862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.459 #41 NEW cov: 11823 ft: 14598 corp: 20/556b lim: 40 exec/s: 41 rss: 68Mb L: 40/40 MS: 1 ChangeBit- 00:07:58.459 [2024-11-02 12:07:45.242390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:0000d9d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.242415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.242474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:d9d9d9d9 cdw11:d9d9d9d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.242488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.242546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:d9d9d900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.242560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.459 #42 NEW cov: 11823 ft: 14621 corp: 21/581b lim: 40 exec/s: 42 rss: 68Mb L: 25/40 MS: 1 EraseBytes- 00:07:58.459 [2024-11-02 12:07:45.282638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.282663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.282726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:87878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.282740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.282797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:87878779 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.282811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.282867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:87878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.282880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.459 #43 NEW cov: 11823 ft: 14643 corp: 22/619b lim: 40 exec/s: 43 rss: 68Mb L: 38/40 MS: 1 InsertByte- 00:07:58.459 [2024-11-02 12:07:45.322464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.322489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.322549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.322562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.459 #44 NEW cov: 11823 ft: 14726 corp: 23/641b lim: 40 exec/s: 44 rss: 68Mb L: 22/40 MS: 1 ShuffleBytes- 00:07:58.459 [2024-11-02 12:07:45.362394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:02000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.362419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.459 #45 NEW cov: 11823 ft: 14746 corp: 24/651b lim: 40 exec/s: 45 rss: 68Mb L: 10/40 MS: 1 ChangeBit- 00:07:58.459 [2024-11-02 12:07:45.403000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000040 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.403025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.403099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000b100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.403111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.403173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.403187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.459 [2024-11-02 12:07:45.403245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.459 [2024-11-02 12:07:45.403258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.459 #46 NEW cov: 11823 ft: 14748 corp: 25/687b lim: 40 exec/s: 46 rss: 68Mb L: 36/40 MS: 1 InsertByte- 00:07:58.718 [2024-11-02 12:07:45.442843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.442872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.442929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00850000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.442942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.718 #47 
NEW cov: 11823 ft: 14765 corp: 26/709b lim: 40 exec/s: 47 rss: 68Mb L: 22/40 MS: 1 EraseBytes- 00:07:58.718 [2024-11-02 12:07:45.482978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0000010e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.483009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.483069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.483083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.718 #48 NEW cov: 11823 ft: 14801 corp: 27/725b lim: 40 exec/s: 48 rss: 68Mb L: 16/40 MS: 1 CMP- DE: "\001\016"- 00:07:58.718 [2024-11-02 12:07:45.523564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:58000000 cdw11:0000bcbc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.523589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.523645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:bcbcbc00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.523658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.523717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.523730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.523789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.523802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.523862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.523875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.718 #49 NEW cov: 11823 ft: 14836 corp: 28/765b lim: 40 exec/s: 49 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:58.718 [2024-11-02 12:07:45.562968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.562999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.718 #50 NEW cov: 11823 ft: 14847 corp: 29/775b lim: 40 exec/s: 50 rss: 68Mb L: 10/40 MS: 1 ShuffleBytes- 00:07:58.718 [2024-11-02 12:07:45.603624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 
nsid:0 cdw10:0a000000 cdw11:0000d9d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.603650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.603708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:d9d92e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.603725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.603782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2ed9 cdw11:d9d9d9d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.603795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.603850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:d9d9d9d9 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.603864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.718 #51 NEW cov: 11823 ft: 14874 corp: 30/811b lim: 40 exec/s: 51 rss: 68Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:07:58.718 [2024-11-02 12:07:45.643693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.643718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.643794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:87878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.643808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.643862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:87878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.643875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.643932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:87600087 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.643945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.718 #52 NEW cov: 11823 ft: 14881 corp: 31/848b lim: 40 exec/s: 52 rss: 68Mb L: 37/40 MS: 1 CrossOver- 00:07:58.718 [2024-11-02 12:07:45.683833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.683858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.683934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND 
(81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.683948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.718 [2024-11-02 12:07:45.684007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.718 [2024-11-02 12:07:45.684021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.719 [2024-11-02 12:07:45.684075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.719 [2024-11-02 12:07:45.684088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.977 #53 NEW cov: 11823 ft: 14919 corp: 32/886b lim: 40 exec/s: 53 rss: 68Mb L: 38/40 MS: 1 CrossOver- 00:07:58.977 [2024-11-02 12:07:45.723809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:6000010e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.977 [2024-11-02 12:07:45.723836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.977 [2024-11-02 12:07:45.723897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000085 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.977 [2024-11-02 12:07:45.723910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.977 [2024-11-02 12:07:45.723967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.977 [2024-11-02 12:07:45.723981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.977 #54 NEW cov: 11823 ft: 14939 corp: 33/910b lim: 40 exec/s: 54 rss: 68Mb L: 24/40 MS: 1 PersAutoDict- DE: "\001\016"- 00:07:58.977 [2024-11-02 12:07:45.764104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.977 [2024-11-02 12:07:45.764129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.977 [2024-11-02 12:07:45.764202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.977 [2024-11-02 12:07:45.764216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.977 [2024-11-02 12:07:45.764270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.977 [2024-11-02 12:07:45.764283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.977 [2024-11-02 12:07:45.764339] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.977 [2024-11-02 12:07:45.764352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.977 #55 NEW cov: 11823 ft: 14955 corp: 34/945b lim: 40 exec/s: 55 rss: 69Mb L: 35/40 MS: 1 ShuffleBytes- 00:07:58.977 [2024-11-02 12:07:45.804187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000040 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.977 [2024-11-02 12:07:45.804212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.977 [2024-11-02 12:07:45.804270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.804284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.978 [2024-11-02 12:07:45.804340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.804353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.978 [2024-11-02 12:07:45.804409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.804422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.978 #56 NEW cov: 11823 ft: 14989 corp: 35/979b lim: 40 exec/s: 56 rss: 69Mb L: 34/40 MS: 1 EraseBytes- 00:07:58.978 [2024-11-02 12:07:45.844515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.844539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.978 [2024-11-02 12:07:45.844615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.844629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.978 [2024-11-02 12:07:45.844686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.844699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.978 [2024-11-02 12:07:45.844754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:d9d9d900 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.844768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.978 [2024-11-02 12:07:45.844824] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.844837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.978 #57 NEW cov: 11823 ft: 15060 corp: 36/1019b lim: 40 exec/s: 57 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:07:58.978 [2024-11-02 12:07:45.884294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.884319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.978 [2024-11-02 12:07:45.884379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:87878787 cdw11:87878787 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.884393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.978 [2024-11-02 12:07:45.884452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:87878779 cdw11:87878760 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.884465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.978 #58 NEW cov: 11823 ft: 15086 corp: 37/1045b lim: 40 exec/s: 58 rss: 69Mb L: 26/40 MS: 1 CrossOver- 00:07:58.978 [2024-11-02 12:07:45.924547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:0008d9d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.924572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.978 [2024-11-02 12:07:45.924645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:d9d92e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.924658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.978 [2024-11-02 12:07:45.924712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2ed9 cdw11:d9d9d9d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.924725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.978 [2024-11-02 12:07:45.924778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:d9d9d9d9 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.978 [2024-11-02 12:07:45.924795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.978 #59 NEW cov: 11823 ft: 15110 corp: 38/1081b lim: 40 exec/s: 59 rss: 69Mb L: 36/40 MS: 1 ChangeBinInt- 00:07:59.237 [2024-11-02 12:07:45.964670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:60000000 cdw11:00000740 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:45.964695] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:45.964765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:45.964779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:45.964834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:45.964847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:45.964899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:45.964913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.237 #60 NEW cov: 11823 ft: 15124 corp: 39/1116b lim: 40 exec/s: 60 rss: 69Mb L: 35/40 MS: 1 ChangeBinInt- 00:07:59.237 [2024-11-02 12:07:46.004853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:01000004 cdw11:00000040 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.004878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:46.004950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000b100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.004964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:46.005022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.005035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:46.005090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.005103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.237 #61 NEW cov: 11823 ft: 15140 corp: 40/1152b lim: 40 exec/s: 61 rss: 69Mb L: 36/40 MS: 1 CMP- DE: "\001\000\000\004"- 00:07:59.237 [2024-11-02 12:07:46.044606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000016 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.044631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:46.044687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 
12:07:46.044701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.237 #62 NEW cov: 11823 ft: 15158 corp: 41/1174b lim: 40 exec/s: 62 rss: 69Mb L: 22/40 MS: 1 ChangeBinInt- 00:07:59.237 [2024-11-02 12:07:46.084852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.084877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:46.084938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:19d9d9d9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.084952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:46.085011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:d9d9d900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.085040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.237 #63 NEW cov: 11823 ft: 15167 corp: 42/1199b lim: 40 exec/s: 63 rss: 69Mb L: 25/40 MS: 1 ChangeBinInt- 00:07:59.237 [2024-11-02 12:07:46.125118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:58000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.125143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:46.125215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00f7ffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.125230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:46.125284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.125297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:46.125351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:23000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.125364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.237 #64 NEW cov: 11823 ft: 15199 corp: 43/1234b lim: 40 exec/s: 64 rss: 69Mb L: 35/40 MS: 1 ChangeBinInt- 00:07:59.237 [2024-11-02 12:07:46.165413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:58000000 cdw11:0000bcbc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.237 [2024-11-02 12:07:46.165437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.237 [2024-11-02 12:07:46.165507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:bcbcbc00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.238 [2024-11-02 12:07:46.165521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.238 [2024-11-02 12:07:46.165576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.238 [2024-11-02 12:07:46.165590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.238 [2024-11-02 12:07:46.165647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.238 [2024-11-02 12:07:46.165660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.238 [2024-11-02 12:07:46.165714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.238 [2024-11-02 12:07:46.165731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.238 #65 NEW cov: 11823 ft: 15213 corp: 44/1274b lim: 40 exec/s: 32 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:07:59.238 #65 DONE cov: 11823 ft: 15213 corp: 44/1274b lim: 40 exec/s: 32 rss: 69Mb 00:07:59.238 ###### Recommended dictionary. ###### 00:07:59.238 "\001\016" # Uses: 1 00:07:59.238 "\001\000\000\004" # Uses: 0 00:07:59.238 ###### End of recommended dictionary. ###### 00:07:59.238 Done 65 runs in 2 second(s) 00:07:59.496 12:07:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:59.496 12:07:46 -- ../common.sh@72 -- # (( i++ )) 00:07:59.496 12:07:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.496 12:07:46 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:59.496 12:07:46 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:59.496 12:07:46 -- nvmf/run.sh@24 -- # local timen=1 00:07:59.496 12:07:46 -- nvmf/run.sh@25 -- # local core=0x1 00:07:59.496 12:07:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:59.496 12:07:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:59.496 12:07:46 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:59.496 12:07:46 -- nvmf/run.sh@29 -- # port=4412 00:07:59.496 12:07:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:59.496 12:07:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:59.496 12:07:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:59.496 12:07:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:59.496 [2024-11-02 12:07:46.348246] Starting SPDK 
v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:07:59.496 [2024-11-02 12:07:46.348338] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1149188 ] 00:07:59.496 EAL: No free 2048 kB hugepages reported on node 1 00:07:59.755 [2024-11-02 12:07:46.604887] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.755 [2024-11-02 12:07:46.631288] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:59.755 [2024-11-02 12:07:46.631423] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.755 [2024-11-02 12:07:46.682869] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:59.755 [2024-11-02 12:07:46.699268] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:59.755 INFO: Running with entropic power schedule (0xFF, 100). 00:07:59.755 INFO: Seed: 1948873929 00:08:00.013 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:00.013 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:00.013 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:00.013 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.013 #2 INITED exec/s: 0 rss: 59Mb 00:08:00.013 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:00.013 This may also happen if the target rejected all inputs we tried so far 00:08:00.013 [2024-11-02 12:07:46.754417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.013 [2024-11-02 12:07:46.754445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.271 NEW_FUNC[1/671]: 0x461d28 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:00.271 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:00.271 #4 NEW cov: 11594 ft: 11595 corp: 2/15b lim: 40 exec/s: 0 rss: 67Mb L: 14/14 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:00.271 [2024-11-02 12:07:47.065228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.271 [2024-11-02 12:07:47.065275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.271 #10 NEW cov: 11707 ft: 12145 corp: 3/29b lim: 40 exec/s: 0 rss: 67Mb L: 14/14 MS: 1 CopyPart- 00:08:00.271 [2024-11-02 12:07:47.115219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.272 [2024-11-02 12:07:47.115244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.272 #11 NEW cov: 11713 ft: 12515 corp: 4/43b lim: 40 exec/s: 0 rss: 67Mb L: 14/14 MS: 1 CopyPart- 00:08:00.272 [2024-11-02 12:07:47.155338] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.272 [2024-11-02 12:07:47.155362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.272 #17 NEW cov: 11798 ft: 12806 corp: 5/57b lim: 40 exec/s: 0 rss: 67Mb L: 14/14 MS: 1 CrossOver- 00:08:00.272 [2024-11-02 12:07:47.195422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffff03 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.272 [2024-11-02 12:07:47.195447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.272 #22 NEW cov: 11798 ft: 12937 corp: 6/68b lim: 40 exec/s: 0 rss: 67Mb L: 11/14 MS: 5 CopyPart-InsertByte-InsertByte-ChangeBinInt-CMP- DE: "\377\377\377\377\377\377\003\000"- 00:08:00.272 [2024-11-02 12:07:47.235572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.272 [2024-11-02 12:07:47.235597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.530 #23 NEW cov: 11798 ft: 13029 corp: 7/83b lim: 40 exec/s: 0 rss: 67Mb L: 15/15 MS: 1 InsertByte- 00:08:00.530 [2024-11-02 12:07:47.275650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.530 [2024-11-02 12:07:47.275675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.530 #24 NEW cov: 11798 ft: 13139 corp: 8/94b lim: 40 exec/s: 0 rss: 67Mb L: 11/15 MS: 1 EraseBytes- 00:08:00.530 [2024-11-02 12:07:47.315776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a28ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.530 [2024-11-02 12:07:47.315801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.530 #25 NEW cov: 11798 ft: 13214 corp: 9/109b lim: 40 exec/s: 0 rss: 67Mb L: 15/15 MS: 1 ShuffleBytes- 00:08:00.530 [2024-11-02 12:07:47.355876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.530 [2024-11-02 12:07:47.355900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.530 #26 NEW cov: 11798 ft: 13276 corp: 10/122b lim: 40 exec/s: 0 rss: 67Mb L: 13/15 MS: 1 EraseBytes- 00:08:00.530 [2024-11-02 12:07:47.396042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.530 [2024-11-02 12:07:47.396069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.530 #27 NEW cov: 11798 ft: 13300 corp: 11/131b lim: 40 exec/s: 0 rss: 67Mb L: 9/15 MS: 1 EraseBytes- 00:08:00.530 [2024-11-02 12:07:47.436163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 
cdw10:28ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.530 [2024-11-02 12:07:47.436188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.530 #28 NEW cov: 11798 ft: 13338 corp: 12/145b lim: 40 exec/s: 0 rss: 67Mb L: 14/15 MS: 1 ShuffleBytes- 00:08:00.530 [2024-11-02 12:07:47.476252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28ffefff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.530 [2024-11-02 12:07:47.476276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.530 #34 NEW cov: 11798 ft: 13459 corp: 13/159b lim: 40 exec/s: 0 rss: 67Mb L: 14/15 MS: 1 ChangeBit- 00:08:00.789 [2024-11-02 12:07:47.516340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affef cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.789 [2024-11-02 12:07:47.516365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.789 #35 NEW cov: 11798 ft: 13490 corp: 14/170b lim: 40 exec/s: 0 rss: 67Mb L: 11/15 MS: 1 ChangeBit- 00:08:00.789 [2024-11-02 12:07:47.556461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:31280aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.789 [2024-11-02 12:07:47.556485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.789 #36 NEW cov: 11798 ft: 13520 corp: 15/182b lim: 40 exec/s: 0 rss: 67Mb L: 12/15 MS: 1 InsertByte- 00:08:00.789 [2024-11-02 12:07:47.596602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff28ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.789 [2024-11-02 12:07:47.596627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.789 #40 NEW cov: 11798 ft: 13573 corp: 16/197b lim: 40 exec/s: 0 rss: 67Mb L: 15/15 MS: 4 CrossOver-ChangeBit-CrossOver-CrossOver- 00:08:00.789 [2024-11-02 12:07:47.636859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.789 [2024-11-02 12:07:47.636886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.789 [2024-11-02 12:07:47.636943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff7aff cdw11:ffff27ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.789 [2024-11-02 12:07:47.636957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.789 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:00.789 #41 NEW cov: 11821 ft: 14320 corp: 17/213b lim: 40 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 InsertByte- 00:08:00.789 [2024-11-02 12:07:47.676801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.789 [2024-11-02 12:07:47.676827] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.789 #42 NEW cov: 11821 ft: 14329 corp: 18/228b lim: 40 exec/s: 0 rss: 68Mb L: 15/16 MS: 1 EraseBytes- 00:08:00.789 [2024-11-02 12:07:47.716917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.789 [2024-11-02 12:07:47.716945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.789 #43 NEW cov: 11821 ft: 14374 corp: 19/242b lim: 40 exec/s: 0 rss: 68Mb L: 14/16 MS: 1 ChangeBit- 00:08:00.789 [2024-11-02 12:07:47.757046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.789 [2024-11-02 12:07:47.757070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.047 #49 NEW cov: 11821 ft: 14409 corp: 20/255b lim: 40 exec/s: 49 rss: 68Mb L: 13/16 MS: 1 CopyPart- 00:08:01.047 [2024-11-02 12:07:47.797150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.047 [2024-11-02 12:07:47.797175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.047 #50 NEW cov: 11821 ft: 14423 corp: 21/266b lim: 40 exec/s: 50 rss: 68Mb L: 11/16 MS: 1 ChangeBit- 00:08:01.047 [2024-11-02 12:07:47.827427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.048 [2024-11-02 12:07:47.827451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.048 [2024-11-02 12:07:47.827526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:bfff7aff cdw11:ffff27ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.048 [2024-11-02 12:07:47.827540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.048 #51 NEW cov: 11821 ft: 14437 corp: 22/282b lim: 40 exec/s: 51 rss: 68Mb L: 16/16 MS: 1 ChangeBit- 00:08:01.048 [2024-11-02 12:07:47.867366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a28f756 cdw11:ffeaeaea SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.048 [2024-11-02 12:07:47.867390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.048 #55 NEW cov: 11821 ft: 14452 corp: 23/292b lim: 40 exec/s: 55 rss: 68Mb L: 10/16 MS: 4 CrossOver-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:01.048 [2024-11-02 12:07:47.907483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffff77ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.048 [2024-11-02 12:07:47.907507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.048 #56 NEW cov: 11821 ft: 14461 corp: 24/307b lim: 40 exec/s: 56 rss: 68Mb L: 15/16 MS: 1 ChangeByte- 00:08:01.048 [2024-11-02 
12:07:47.937750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28020000 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.048 [2024-11-02 12:07:47.937774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.048 [2024-11-02 12:07:47.937848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.048 [2024-11-02 12:07:47.937861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.048 #57 NEW cov: 11821 ft: 14478 corp: 25/325b lim: 40 exec/s: 57 rss: 68Mb L: 18/18 MS: 1 CMP- DE: "\002\000\000\000"- 00:08:01.048 [2024-11-02 12:07:47.977719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:02ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.048 [2024-11-02 12:07:47.977744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.048 #58 NEW cov: 11821 ft: 14493 corp: 26/339b lim: 40 exec/s: 58 rss: 68Mb L: 14/18 MS: 1 InsertByte- 00:08:01.048 [2024-11-02 12:07:48.017806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:fffe27ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.048 [2024-11-02 12:07:48.017832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.306 #59 NEW cov: 11821 ft: 14521 corp: 27/347b lim: 40 exec/s: 59 rss: 68Mb L: 8/18 MS: 1 EraseBytes- 00:08:01.306 [2024-11-02 12:07:48.057951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28ffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.306 [2024-11-02 12:07:48.057977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.306 #60 NEW cov: 11821 ft: 14550 corp: 28/361b lim: 40 exec/s: 60 rss: 68Mb L: 14/18 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:01.306 [2024-11-02 12:07:48.088086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a28f756 cdw11:ffea8fea SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.306 [2024-11-02 12:07:48.088111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.306 #61 NEW cov: 11821 ft: 14602 corp: 29/371b lim: 40 exec/s: 61 rss: 68Mb L: 10/18 MS: 1 ChangeByte- 00:08:01.306 [2024-11-02 12:07:48.128220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2428ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.306 [2024-11-02 12:07:48.128246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.306 #62 NEW cov: 11821 ft: 14662 corp: 30/386b lim: 40 exec/s: 62 rss: 68Mb L: 15/18 MS: 1 InsertByte- 00:08:01.306 [2024-11-02 12:07:48.168297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28ffffff cdw11:fdffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.306 [2024-11-02 12:07:48.168321] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.306 #63 NEW cov: 11821 ft: 14676 corp: 31/400b lim: 40 exec/s: 63 rss: 68Mb L: 14/18 MS: 1 ChangeBit- 00:08:01.306 [2024-11-02 12:07:48.198674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.306 [2024-11-02 12:07:48.198699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.306 [2024-11-02 12:07:48.198776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.306 [2024-11-02 12:07:48.198790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.306 [2024-11-02 12:07:48.198848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.306 [2024-11-02 12:07:48.198862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.306 #64 NEW cov: 11821 ft: 14939 corp: 32/425b lim: 40 exec/s: 64 rss: 69Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:01.306 [2024-11-02 12:07:48.238641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:24e528ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.306 [2024-11-02 12:07:48.238665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.306 [2024-11-02 12:07:48.238735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.306 [2024-11-02 12:07:48.238749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.306 #65 NEW cov: 11821 ft: 14951 corp: 33/441b lim: 40 exec/s: 65 rss: 69Mb L: 16/25 MS: 1 InsertByte- 00:08:01.306 [2024-11-02 12:07:48.278609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.306 [2024-11-02 12:07:48.278633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.565 #66 NEW cov: 11821 ft: 14959 corp: 34/449b lim: 40 exec/s: 66 rss: 69Mb L: 8/25 MS: 1 EraseBytes- 00:08:01.565 [2024-11-02 12:07:48.318711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.565 [2024-11-02 12:07:48.318736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.565 #67 NEW cov: 11821 ft: 14972 corp: 35/463b lim: 40 exec/s: 67 rss: 69Mb L: 14/25 MS: 1 CopyPart- 00:08:01.565 [2024-11-02 12:07:48.358807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.565 [2024-11-02 12:07:48.358833] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.565 #68 NEW cov: 11821 ft: 15034 corp: 36/473b lim: 40 exec/s: 68 rss: 69Mb L: 10/25 MS: 1 EraseBytes- 00:08:01.565 [2024-11-02 12:07:48.388947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affef cdw11:f7ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.565 [2024-11-02 12:07:48.388972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.565 #69 NEW cov: 11821 ft: 15042 corp: 37/484b lim: 40 exec/s: 69 rss: 69Mb L: 11/25 MS: 1 ChangeBit- 00:08:01.565 [2024-11-02 12:07:48.429094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.565 [2024-11-02 12:07:48.429118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.565 #71 NEW cov: 11821 ft: 15043 corp: 38/495b lim: 40 exec/s: 71 rss: 69Mb L: 11/25 MS: 2 CrossOver-PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:01.565 [2024-11-02 12:07:48.469344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28020000 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.565 [2024-11-02 12:07:48.469369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.565 [2024-11-02 12:07:48.469422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.565 [2024-11-02 12:07:48.469437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.565 #72 NEW cov: 11821 ft: 15047 corp: 39/513b lim: 40 exec/s: 72 rss: 69Mb L: 18/25 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:01.565 [2024-11-02 12:07:48.509284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.565 [2024-11-02 12:07:48.509309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.565 #73 NEW cov: 11821 ft: 15056 corp: 40/524b lim: 40 exec/s: 73 rss: 69Mb L: 11/25 MS: 1 ChangeBinInt- 00:08:01.565 [2024-11-02 12:07:48.539377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.565 [2024-11-02 12:07:48.539403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.823 #74 NEW cov: 11821 ft: 15069 corp: 41/535b lim: 40 exec/s: 74 rss: 69Mb L: 11/25 MS: 1 ShuffleBytes- 00:08:01.823 [2024-11-02 12:07:48.569424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:28ffffff cdw11:ffffff03 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.823 [2024-11-02 12:07:48.569448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.824 #75 NEW cov: 11821 ft: 15085 corp: 42/549b lim: 40 exec/s: 75 rss: 69Mb L: 
14/25 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\003\000"- 00:08:01.824 [2024-11-02 12:07:48.609575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:31280aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.824 [2024-11-02 12:07:48.609600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.824 #76 NEW cov: 11821 ft: 15086 corp: 43/562b lim: 40 exec/s: 76 rss: 69Mb L: 13/25 MS: 1 InsertByte- 00:08:01.824 [2024-11-02 12:07:48.649673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff28ffff cdw11:ff28ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.824 [2024-11-02 12:07:48.649699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.824 #78 NEW cov: 11821 ft: 15089 corp: 44/572b lim: 40 exec/s: 78 rss: 69Mb L: 10/25 MS: 2 CrossOver-CopyPart- 00:08:01.824 [2024-11-02 12:07:48.689815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280a0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.824 [2024-11-02 12:07:48.689840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.824 #82 NEW cov: 11821 ft: 15148 corp: 45/586b lim: 40 exec/s: 82 rss: 70Mb L: 14/25 MS: 4 EraseBytes-CrossOver-ChangeBit-PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:01.824 [2024-11-02 12:07:48.730110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:280affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.824 [2024-11-02 12:07:48.730135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.824 [2024-11-02 12:07:48.730190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff7a28 cdw11:0aff27ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.824 [2024-11-02 12:07:48.730203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.824 #83 NEW cov: 11821 ft: 15156 corp: 46/602b lim: 40 exec/s: 41 rss: 70Mb L: 16/25 MS: 1 CopyPart- 00:08:01.824 #83 DONE cov: 11821 ft: 15156 corp: 46/602b lim: 40 exec/s: 41 rss: 70Mb 00:08:01.824 ###### Recommended dictionary. ###### 00:08:01.824 "\377\377\377\377\377\377\003\000" # Uses: 1 00:08:01.824 "\002\000\000\000" # Uses: 1 00:08:01.824 "\000\000\000\000\000\000\000\000" # Uses: 2 00:08:01.824 ###### End of recommended dictionary. 
###### 00:08:01.824 Done 83 runs in 2 second(s) 00:08:02.083 12:07:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:02.083 12:07:48 -- ../common.sh@72 -- # (( i++ )) 00:08:02.083 12:07:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.083 12:07:48 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:02.083 12:07:48 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:02.083 12:07:48 -- nvmf/run.sh@24 -- # local timen=1 00:08:02.083 12:07:48 -- nvmf/run.sh@25 -- # local core=0x1 00:08:02.083 12:07:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:02.083 12:07:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:02.083 12:07:48 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:02.083 12:07:48 -- nvmf/run.sh@29 -- # port=4413 00:08:02.083 12:07:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:02.083 12:07:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:02.083 12:07:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:02.083 12:07:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:02.083 [2024-11-02 12:07:48.907865] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:02.083 [2024-11-02 12:07:48.907960] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1149736 ] 00:08:02.083 EAL: No free 2048 kB hugepages reported on node 1 00:08:02.341 [2024-11-02 12:07:49.162097] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.341 [2024-11-02 12:07:49.191488] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:02.341 [2024-11-02 12:07:49.191621] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.341 [2024-11-02 12:07:49.242904] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:02.341 [2024-11-02 12:07:49.259285] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:02.341 INFO: Running with entropic power schedule (0xFF, 100). 00:08:02.341 INFO: Seed: 214885294 00:08:02.341 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:02.341 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:02.341 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:02.341 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.341 #2 INITED exec/s: 0 rss: 59Mb 00:08:02.341 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:02.341 This may also happen if the target rejected all inputs we tried so far 00:08:02.341 [2024-11-02 12:07:49.303835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:27535353 cdw11:53535353 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.341 [2024-11-02 12:07:49.303867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.865 NEW_FUNC[1/670]: 0x4638f8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:02.865 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:02.865 #6 NEW cov: 11566 ft: 11530 corp: 2/14b lim: 40 exec/s: 0 rss: 67Mb L: 13/13 MS: 4 CrossOver-ChangeByte-EraseBytes-InsertRepeatedBytes- 00:08:02.865 [2024-11-02 12:07:49.624599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.865 [2024-11-02 12:07:49.624637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.865 #15 NEW cov: 11695 ft: 12087 corp: 3/25b lim: 40 exec/s: 0 rss: 68Mb L: 11/13 MS: 4 InsertByte-InsertByte-ChangeByte-CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:02.865 [2024-11-02 12:07:49.674630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:27535353 cdw11:53535353 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.865 [2024-11-02 12:07:49.674661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.865 #16 NEW cov: 11701 ft: 12310 corp: 4/39b lim: 40 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 InsertByte- 00:08:02.865 [2024-11-02 12:07:49.734770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff030000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.865 [2024-11-02 12:07:49.734809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.865 #18 NEW cov: 11786 ft: 12568 corp: 5/48b lim: 40 exec/s: 0 rss: 68Mb L: 9/14 MS: 2 CrossOver-CMP- DE: "\377\003\000\000\000\000\000\000"- 00:08:02.865 [2024-11-02 12:07:49.784923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.865 [2024-11-02 12:07:49.784953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.865 #19 NEW cov: 11786 ft: 12676 corp: 6/57b lim: 40 exec/s: 0 rss: 68Mb L: 9/14 MS: 1 ChangeBinInt- 00:08:03.125 [2024-11-02 12:07:49.845085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:27535353 cdw11:53535353 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.125 [2024-11-02 12:07:49.845116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.125 #20 NEW cov: 11786 ft: 12776 corp: 7/70b lim: 40 exec/s: 0 rss: 68Mb L: 13/14 MS: 1 ShuffleBytes- 00:08:03.125 [2024-11-02 12:07:49.895181] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7aff00ff cdw11:03000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.125 [2024-11-02 12:07:49.895212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.125 #24 NEW cov: 11786 ft: 12829 corp: 8/84b lim: 40 exec/s: 0 rss: 68Mb L: 14/14 MS: 4 ChangeByte-ChangeByte-CrossOver-PersAutoDict- DE: "\377\003\000\000\000\000\000\000"- 00:08:03.125 [2024-11-02 12:07:49.945300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7b4aff03 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.125 [2024-11-02 12:07:49.945331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.125 #29 NEW cov: 11786 ft: 12928 corp: 9/94b lim: 40 exec/s: 0 rss: 68Mb L: 10/14 MS: 5 CrossOver-InsertByte-ChangeBit-ChangeBit-PersAutoDict- DE: "\377\003\000\000\000\000\000\000"- 00:08:03.125 [2024-11-02 12:07:49.995492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.125 [2024-11-02 12:07:49.995523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.125 #30 NEW cov: 11786 ft: 12968 corp: 10/105b lim: 40 exec/s: 0 rss: 68Mb L: 11/14 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:03.125 [2024-11-02 12:07:50.065686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78ff03ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.125 [2024-11-02 12:07:50.065726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.384 #31 NEW cov: 11786 ft: 13070 corp: 11/116b lim: 40 exec/s: 0 rss: 68Mb L: 11/14 MS: 1 ChangeBinInt- 00:08:03.384 [2024-11-02 12:07:50.135933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.384 [2024-11-02 12:07:50.135971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.384 [2024-11-02 12:07:50.136011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ff3f0aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.384 [2024-11-02 12:07:50.136043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.384 #32 NEW cov: 11786 ft: 13428 corp: 12/136b lim: 40 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CopyPart- 00:08:03.384 [2024-11-02 12:07:50.195976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78ff03ff cdw11:ff78ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.384 [2024-11-02 12:07:50.196032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.384 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:03.384 #33 NEW cov: 11809 ft: 13462 corp: 13/147b lim: 40 exec/s: 0 rss: 68Mb L: 11/20 
MS: 1 ChangeByte- 00:08:03.384 [2024-11-02 12:07:50.266177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78ffffff cdw11:ff5cffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.384 [2024-11-02 12:07:50.266208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.384 #34 NEW cov: 11809 ft: 13489 corp: 14/159b lim: 40 exec/s: 34 rss: 68Mb L: 12/20 MS: 1 InsertByte- 00:08:03.384 [2024-11-02 12:07:50.316379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.384 [2024-11-02 12:07:50.316409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.384 [2024-11-02 12:07:50.316442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ff3f0aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.384 [2024-11-02 12:07:50.316458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.643 #35 NEW cov: 11809 ft: 13527 corp: 15/179b lim: 40 exec/s: 35 rss: 68Mb L: 20/20 MS: 1 ChangeByte- 00:08:03.643 [2024-11-02 12:07:50.386508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff5c78ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.643 [2024-11-02 12:07:50.386538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.643 #36 NEW cov: 11809 ft: 13549 corp: 16/191b lim: 40 exec/s: 36 rss: 68Mb L: 12/20 MS: 1 ShuffleBytes- 00:08:03.643 [2024-11-02 12:07:50.456745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7b4aff03 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.643 [2024-11-02 12:07:50.456775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.643 [2024-11-02 12:07:50.456824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.643 [2024-11-02 12:07:50.456839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.643 #37 NEW cov: 11809 ft: 13574 corp: 17/208b lim: 40 exec/s: 37 rss: 69Mb L: 17/20 MS: 1 InsertRepeatedBytes- 00:08:03.643 [2024-11-02 12:07:50.526863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:03ff0300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.643 [2024-11-02 12:07:50.526894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.643 #40 NEW cov: 11809 ft: 13576 corp: 18/217b lim: 40 exec/s: 40 rss: 69Mb L: 9/20 MS: 3 ChangeBit-ChangeBit-PersAutoDict- DE: "\377\003\000\000\000\000\000\000"- 00:08:03.643 [2024-11-02 12:07:50.577061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:03474747 cdw11:47474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.643 [2024-11-02 12:07:50.577091] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.643 [2024-11-02 12:07:50.577124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:474747ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.643 [2024-11-02 12:07:50.577140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.903 #41 NEW cov: 11809 ft: 13623 corp: 19/240b lim: 40 exec/s: 41 rss: 69Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:03.903 [2024-11-02 12:07:50.647265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:03474747 cdw11:63474747 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.903 [2024-11-02 12:07:50.647295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.903 [2024-11-02 12:07:50.647328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:47474747 cdw11:474747ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.903 [2024-11-02 12:07:50.647344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.903 #42 NEW cov: 11809 ft: 13634 corp: 20/263b lim: 40 exec/s: 42 rss: 69Mb L: 23/23 MS: 1 ChangeByte- 00:08:03.903 [2024-11-02 12:07:50.717388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:03ff0300 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.903 [2024-11-02 12:07:50.717419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.903 #43 NEW cov: 11809 ft: 13648 corp: 21/272b lim: 40 exec/s: 43 rss: 69Mb L: 9/23 MS: 1 CrossOver- 00:08:03.903 [2024-11-02 12:07:50.767506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78ffffff cdw11:ffffff7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.903 [2024-11-02 12:07:50.767536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.903 #44 NEW cov: 11809 ft: 13692 corp: 22/283b lim: 40 exec/s: 44 rss: 69Mb L: 11/23 MS: 1 ChangeBit- 00:08:03.903 [2024-11-02 12:07:50.817628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:03ff03ff cdw11:030000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.903 [2024-11-02 12:07:50.817658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.903 #45 NEW cov: 11809 ft: 13711 corp: 23/292b lim: 40 exec/s: 45 rss: 69Mb L: 9/23 MS: 1 CopyPart- 00:08:04.162 [2024-11-02 12:07:50.887817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.162 [2024-11-02 12:07:50.887848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.162 #46 NEW cov: 11809 ft: 13752 corp: 24/306b lim: 40 exec/s: 46 rss: 69Mb L: 14/23 MS: 1 EraseBytes- 00:08:04.162 [2024-11-02 12:07:50.958038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:27535353 cdw11:53535353 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.162 [2024-11-02 12:07:50.958069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.162 [2024-11-02 12:07:51.008124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:27535353 cdw11:53535353 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.162 [2024-11-02 12:07:51.008155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.162 #48 NEW cov: 11809 ft: 13775 corp: 25/319b lim: 40 exec/s: 48 rss: 69Mb L: 13/23 MS: 2 ChangeBinInt-CopyPart- 00:08:04.162 [2024-11-02 12:07:51.058263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff5c78ff cdw11:ffff13ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.162 [2024-11-02 12:07:51.058293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.162 #49 NEW cov: 11809 ft: 13783 corp: 26/331b lim: 40 exec/s: 49 rss: 69Mb L: 12/23 MS: 1 ChangeByte- 00:08:04.162 [2024-11-02 12:07:51.118421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0e000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.162 [2024-11-02 12:07:51.118453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.422 #50 NEW cov: 11809 ft: 13820 corp: 27/345b lim: 40 exec/s: 50 rss: 69Mb L: 14/23 MS: 1 ChangeBinInt- 00:08:04.422 [2024-11-02 12:07:51.178690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff5c78ff cdw11:ffff13ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.422 [2024-11-02 12:07:51.178720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.422 [2024-11-02 12:07:51.178767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.422 [2024-11-02 12:07:51.178782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.422 [2024-11-02 12:07:51.178812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.422 [2024-11-02 12:07:51.178827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.422 [2024-11-02 12:07:51.178855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7b7b7b7b cdw11:7b7b7b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.422 [2024-11-02 12:07:51.178869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.422 #51 NEW cov: 11809 ft: 14348 corp: 28/382b lim: 40 exec/s: 51 rss: 69Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:04.422 [2024-11-02 12:07:51.248757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:27535353 cdw11:53535353 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.422 [2024-11-02 12:07:51.248788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.422 #52 NEW cov: 11809 ft: 14364 corp: 29/395b lim: 40 exec/s: 52 rss: 69Mb L: 13/37 MS: 1 ChangeBit- 00:08:04.422 [2024-11-02 12:07:51.298940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.422 [2024-11-02 12:07:51.298970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.422 [2024-11-02 12:07:51.299010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffff0a cdw11:ff3fffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.422 [2024-11-02 12:07:51.299026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.422 #53 NEW cov: 11809 ft: 14383 corp: 30/415b lim: 40 exec/s: 26 rss: 69Mb L: 20/37 MS: 1 ShuffleBytes- 00:08:04.422 #53 DONE cov: 11809 ft: 14383 corp: 30/415b lim: 40 exec/s: 26 rss: 69Mb 00:08:04.422 ###### Recommended dictionary. ###### 00:08:04.422 "\377\377\377\377\377\377\377\377" # Uses: 1 00:08:04.422 "\377\003\000\000\000\000\000\000" # Uses: 3 00:08:04.422 ###### End of recommended dictionary. ###### 00:08:04.422 Done 53 runs in 2 second(s) 00:08:04.682 12:07:51 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:04.682 12:07:51 -- ../common.sh@72 -- # (( i++ )) 00:08:04.682 12:07:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:04.682 12:07:51 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:04.682 12:07:51 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:04.682 12:07:51 -- nvmf/run.sh@24 -- # local timen=1 00:08:04.682 12:07:51 -- nvmf/run.sh@25 -- # local core=0x1 00:08:04.682 12:07:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:04.682 12:07:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:04.682 12:07:51 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:04.682 12:07:51 -- nvmf/run.sh@29 -- # port=4414 00:08:04.682 12:07:51 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:04.682 12:07:51 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:04.682 12:07:51 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:04.682 12:07:51 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:04.682 [2024-11-02 12:07:51.493965] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
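[Editor's note] The nvmf/run.sh xtrace lines above document the whole per-fuzzer launch protocol: each fuzzer type N gets its own TCP port 44NN, its own JSON config rewritten from the shared fuzz_json.conf template, its own corpus directory, and its own RPC socket. Below is a minimal sketch of that sequence, reconstructed only from the trace; SPDK_ROOT is an assumed stand-in for the /var/jenkins/workspace/short-fuzz-phy-autotest/spdk checkout, and the redirect into $nvmf_cfg is inferred from the -c argument rather than shown verbatim in the trace.

    # Hypothetical reconstruction of nvmf/run.sh start_llvm_fuzz, based on the
    # xtrace output above; not copied from the repository itself.
    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3
        local corpus_dir=$SPDK_ROOT/../corpus/llvm_nvmf_$fuzzer_type
        local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
        local port="44$(printf %02d "$fuzzer_type")"   # fuzzer 13 -> 4413, 14 -> 4414
        mkdir -p "$corpus_dir"
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
        # Rewrite the template's default listener port 4420 to this fuzzer's own
        # port; the redirect into $nvmf_cfg is inferred from the -c flag below.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$SPDK_ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
        "$SPDK_ROOT/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
            -P "$SPDK_ROOT/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
            -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"
    }
    # e.g. start_llvm_fuzz 14 1 0x1, matching the invocation traced above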
00:08:04.682 [2024-11-02 12:07:51.494052] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1150190 ] 00:08:04.682 EAL: No free 2048 kB hugepages reported on node 1 00:08:04.941 [2024-11-02 12:07:51.750049] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.941 [2024-11-02 12:07:51.778160] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:04.941 [2024-11-02 12:07:51.778295] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.941 [2024-11-02 12:07:51.829571] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:04.941 [2024-11-02 12:07:51.845946] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:04.941 INFO: Running with entropic power schedule (0xFF, 100). 00:08:04.941 INFO: Seed: 2799885934 00:08:04.942 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:04.942 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:04.942 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:04.942 INFO: A corpus is not provided, starting from an empty corpus 00:08:04.942 #2 INITED exec/s: 0 rss: 59Mb 00:08:04.942 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:04.942 This may also happen if the target rejected all inputs we tried so far 00:08:04.942 [2024-11-02 12:07:51.901115] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.942 [2024-11-02 12:07:51.901143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.460 NEW_FUNC[1/671]: 0x4654c8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:05.460 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.460 #10 NEW cov: 11576 ft: 11577 corp: 2/13b lim: 35 exec/s: 0 rss: 67Mb L: 12/12 MS: 3 ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:05.460 [2024-11-02 12:07:52.212853] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.212909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.212991] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.213026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.213106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.213141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.460 
[2024-11-02 12:07:52.213219] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.213246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.460 #15 NEW cov: 11689 ft: 13208 corp: 3/44b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 5 ChangeBit-InsertByte-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:05.460 [2024-11-02 12:07:52.261962] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.261989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.460 #16 NEW cov: 11695 ft: 13382 corp: 4/56b lim: 35 exec/s: 0 rss: 68Mb L: 12/31 MS: 1 ChangeByte- 00:08:05.460 [2024-11-02 12:07:52.302558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.302584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.302639] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.302654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.302707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.302723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.302777] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.302792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.460 #27 NEW cov: 11787 ft: 13599 corp: 5/87b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeBinInt- 00:08:05.460 [2024-11-02 12:07:52.352705] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.352732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.352790] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.352805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.352864] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.352878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:08:05.460 [2024-11-02 12:07:52.352935] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.352950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.460 #28 NEW cov: 11787 ft: 13677 corp: 6/118b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 CopyPart- 00:08:05.460 [2024-11-02 12:07:52.392762] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.392789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.392851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.392865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.392921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.392936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.392990] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.393009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.460 #29 NEW cov: 11787 ft: 13729 corp: 7/149b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 CMP- DE: "\002\000\000\000"- 00:08:05.460 [2024-11-02 12:07:52.432897] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.432923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.432980] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.432999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.433054] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.433070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.460 [2024-11-02 12:07:52.433123] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.460 [2024-11-02 12:07:52.433138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.720 #30 NEW cov: 11787 ft: 13787 corp: 8/180b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ShuffleBytes- 
00:08:05.720 [2024-11-02 12:07:52.483012] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.483038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.720 [2024-11-02 12:07:52.483095] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.483111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.720 [2024-11-02 12:07:52.483182] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.483198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.720 [2024-11-02 12:07:52.483254] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.483269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.720 #31 NEW cov: 11787 ft: 13892 corp: 9/210b lim: 35 exec/s: 0 rss: 68Mb L: 30/31 MS: 1 EraseBytes- 00:08:05.720 [2024-11-02 12:07:52.522656] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.522685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.720 #32 NEW cov: 11787 ft: 13977 corp: 10/222b lim: 35 exec/s: 0 rss: 68Mb L: 12/31 MS: 1 CrossOver- 00:08:05.720 [2024-11-02 12:07:52.562985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.563015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.720 [2024-11-02 12:07:52.563076] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.563091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.720 #33 NEW cov: 11787 ft: 14221 corp: 11/242b lim: 35 exec/s: 0 rss: 68Mb L: 20/31 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:05.720 [2024-11-02 12:07:52.602900] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.602926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.720 #39 NEW cov: 11787 ft: 14255 corp: 12/253b lim: 35 exec/s: 0 rss: 68Mb L: 11/31 MS: 1 EraseBytes- 00:08:05.720 [2024-11-02 12:07:52.643465] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.643490] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.720 [2024-11-02 12:07:52.643546] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.643562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.720 [2024-11-02 12:07:52.643617] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.643631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.720 [2024-11-02 12:07:52.643663] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.643678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.720 #40 NEW cov: 11787 ft: 14313 corp: 13/284b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeBit- 00:08:05.720 [2024-11-02 12:07:52.683301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.683329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.720 [2024-11-02 12:07:52.683386] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.720 [2024-11-02 12:07:52.683402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.980 #41 NEW cov: 11787 ft: 14357 corp: 14/301b lim: 35 exec/s: 0 rss: 68Mb L: 17/31 MS: 1 EraseBytes- 00:08:05.980 [2024-11-02 12:07:52.733715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.980 [2024-11-02 12:07:52.733742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.980 [2024-11-02 12:07:52.733800] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.980 [2024-11-02 12:07:52.733815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.980 [2024-11-02 12:07:52.733885] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.980 [2024-11-02 12:07:52.733901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.980 [2024-11-02 12:07:52.733957] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.980 [2024-11-02 12:07:52.733972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.980 #42 NEW cov: 11787 
ft: 14392 corp: 15/332b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ShuffleBytes- 00:08:05.980 [2024-11-02 12:07:52.783881] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.980 [2024-11-02 12:07:52.783909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.980 [2024-11-02 12:07:52.783968] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.980 [2024-11-02 12:07:52.783984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.784062] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.784078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.784136] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.784154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.981 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:05.981 #43 NEW cov: 11810 ft: 14425 corp: 16/365b lim: 35 exec/s: 0 rss: 69Mb L: 33/33 MS: 1 CrossOver- 00:08:05.981 [2024-11-02 12:07:52.823659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.823685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.823744] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.823759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.981 #44 NEW cov: 11810 ft: 14518 corp: 17/381b lim: 35 exec/s: 0 rss: 69Mb L: 16/33 MS: 1 EraseBytes- 00:08:05.981 [2024-11-02 12:07:52.864070] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.864097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.864169] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.864185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.864240] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.864259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT 
SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.864314] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.864329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.981 #45 NEW cov: 11810 ft: 14537 corp: 18/414b lim: 35 exec/s: 45 rss: 69Mb L: 33/33 MS: 1 ChangeByte- 00:08:05.981 [2024-11-02 12:07:52.904226] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.904251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.904322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.904339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.904395] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.904408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.904464] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.904479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.981 #46 NEW cov: 11810 ft: 14552 corp: 19/444b lim: 35 exec/s: 46 rss: 69Mb L: 30/33 MS: 1 ShuffleBytes- 00:08:05.981 [2024-11-02 12:07:52.944341] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.944368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.944424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.944440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.944494] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.944508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.981 [2024-11-02 12:07:52.944562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.981 [2024-11-02 12:07:52.944577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.240 #47 NEW cov: 11810 ft: 14561 corp: 20/477b lim: 35 exec/s: 
47 rss: 69Mb L: 33/33 MS: 1 ChangeBit- 00:08:06.240 [2024-11-02 12:07:52.984429] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.240 [2024-11-02 12:07:52.984457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.240 [2024-11-02 12:07:52.984515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.240 [2024-11-02 12:07:52.984531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.240 [2024-11-02 12:07:52.984606] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:52.984623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:52.984677] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:52.984690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.241 #48 NEW cov: 11810 ft: 14595 corp: 21/508b lim: 35 exec/s: 48 rss: 69Mb L: 31/33 MS: 1 ShuffleBytes- 00:08:06.241 [2024-11-02 12:07:53.024388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.024415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.024474] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.024490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.024548] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.024563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.241 #49 NEW cov: 11810 ft: 14781 corp: 22/531b lim: 35 exec/s: 49 rss: 69Mb L: 23/33 MS: 1 EraseBytes- 00:08:06.241 [2024-11-02 12:07:53.064636] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.064663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.064738] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.064753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.064812] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.064826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.064883] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.064898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.241 #50 NEW cov: 11810 ft: 14841 corp: 23/562b lim: 35 exec/s: 50 rss: 69Mb L: 31/33 MS: 1 ChangeBinInt- 00:08:06.241 [2024-11-02 12:07:53.104800] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.104828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.104884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.104900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.104955] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.104973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.105048] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.105065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.241 #51 NEW cov: 11810 ft: 14894 corp: 24/594b lim: 35 exec/s: 51 rss: 69Mb L: 32/33 MS: 1 CopyPart- 00:08:06.241 [2024-11-02 12:07:53.144754] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.144780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.144851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.144867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.144925] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.144940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.241 #52 NEW cov: 11810 ft: 14926 corp: 25/621b lim: 35 exec/s: 52 rss: 69Mb L: 27/33 MS: 1 CrossOver- 00:08:06.241 [2024-11-02 12:07:53.184892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES 
RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.184918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.184976] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.184991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.241 [2024-11-02 12:07:53.185052] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.241 [2024-11-02 12:07:53.185068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.241 #53 NEW cov: 11810 ft: 14945 corp: 26/644b lim: 35 exec/s: 53 rss: 69Mb L: 23/33 MS: 1 ChangeBit- 00:08:06.501 [2024-11-02 12:07:53.225183] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.225207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.225263] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.225279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.225334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.225348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.225403] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.225419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.501 #54 NEW cov: 11810 ft: 14973 corp: 27/676b lim: 35 exec/s: 54 rss: 69Mb L: 32/33 MS: 1 CrossOver- 00:08:06.501 [2024-11-02 12:07:53.265172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.265198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.265271] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.265287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.265343] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.265359] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.501 #55 NEW cov: 11810 ft: 14995 corp: 28/698b lim: 35 exec/s: 55 rss: 69Mb L: 22/33 MS: 1 EraseBytes- 00:08:06.501 [2024-11-02 12:07:53.305273] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.305300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.305359] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.305376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.305436] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.305450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.501 #56 NEW cov: 11810 ft: 15004 corp: 29/725b lim: 35 exec/s: 56 rss: 69Mb L: 27/33 MS: 1 CopyPart- 00:08:06.501 [2024-11-02 12:07:53.345569] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.345595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.345650] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.345666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.345722] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.345737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.345788] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.345803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.501 #57 NEW cov: 11810 ft: 15021 corp: 30/757b lim: 35 exec/s: 57 rss: 69Mb L: 32/33 MS: 1 InsertRepeatedBytes- 00:08:06.501 [2024-11-02 12:07:53.385804] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.385831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.385889] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.385904] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.385959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.385975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.386048] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.386065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.386120] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.386135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.501 #58 NEW cov: 11810 ft: 15077 corp: 31/792b lim: 35 exec/s: 58 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:06.501 [2024-11-02 12:07:53.425422] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.425448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.425505] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.425521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.501 #59 NEW cov: 11810 ft: 15146 corp: 32/812b lim: 35 exec/s: 59 rss: 70Mb L: 20/35 MS: 1 EraseBytes- 00:08:06.501 [2024-11-02 12:07:53.465901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.465927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.465986] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.466004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.466062] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.466078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.501 [2024-11-02 12:07:53.466131] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.501 [2024-11-02 12:07:53.466147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:08:06.761 #60 NEW cov: 11810 ft: 15158 corp: 33/842b lim: 35 exec/s: 60 rss: 70Mb L: 30/35 MS: 1 EraseBytes- 00:08:06.761 [2024-11-02 12:07:53.505688] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.761 [2024-11-02 12:07:53.505714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.761 [2024-11-02 12:07:53.505769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.761 [2024-11-02 12:07:53.505788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.761 #61 NEW cov: 11810 ft: 15160 corp: 34/862b lim: 35 exec/s: 61 rss: 70Mb L: 20/35 MS: 1 EraseBytes- 00:08:06.761 [2024-11-02 12:07:53.546124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.761 [2024-11-02 12:07:53.546148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.761 [2024-11-02 12:07:53.546206] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.761 [2024-11-02 12:07:53.546221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.761 [2024-11-02 12:07:53.546292] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.761 [2024-11-02 12:07:53.546306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.761 [2024-11-02 12:07:53.546364] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.761 [2024-11-02 12:07:53.546379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.761 #62 NEW cov: 11810 ft: 15172 corp: 35/893b lim: 35 exec/s: 62 rss: 70Mb L: 31/35 MS: 1 EraseBytes- 00:08:06.761 [2024-11-02 12:07:53.596417] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.596443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.596499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.596515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.596573] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.596588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.596642] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.596657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.596712] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.596728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.762 #63 NEW cov: 11810 ft: 15178 corp: 36/928b lim: 35 exec/s: 63 rss: 70Mb L: 35/35 MS: 1 CMP- DE: "\377\377\000\000"- 00:08:06.762 [2024-11-02 12:07:53.636354] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.636379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.636434] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.636450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.636488] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.636503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.636559] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.636574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.762 #64 NEW cov: 11810 ft: 15202 corp: 37/960b lim: 35 exec/s: 64 rss: 70Mb L: 32/35 MS: 1 ChangeByte- 00:08:06.762 [2024-11-02 12:07:53.676424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.676448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.676560] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.676576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.676629] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.676645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.762 NEW_FUNC[1/2]: 0x481108 in feat_power_management /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:282 00:08:06.762 
NEW_FUNC[2/2]: 0x1146588 in nvmf_ctrlr_set_features_power_management /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1503 00:08:06.762 #65 NEW cov: 11860 ft: 15267 corp: 38/992b lim: 35 exec/s: 65 rss: 70Mb L: 32/35 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:06.762 [2024-11-02 12:07:53.726629] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.726659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.726724] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.726742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.726802] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.726820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.762 [2024-11-02 12:07:53.726877] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.762 [2024-11-02 12:07:53.726894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.022 #66 NEW cov: 11860 ft: 15287 corp: 39/1024b lim: 35 exec/s: 66 rss: 70Mb L: 32/35 MS: 1 InsertRepeatedBytes- 00:08:07.022 [2024-11-02 12:07:53.766688] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.766712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.022 [2024-11-02 12:07:53.766769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.766784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.022 [2024-11-02 12:07:53.766843] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.766857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.022 [2024-11-02 12:07:53.766912] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.766927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.022 #67 NEW cov: 11860 ft: 15304 corp: 40/1055b lim: 35 exec/s: 67 rss: 70Mb L: 31/35 MS: 1 ChangeByte- 00:08:07.022 [2024-11-02 12:07:53.806491] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 
12:07:53.806516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.022 [2024-11-02 12:07:53.806574] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.806590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.022 #68 NEW cov: 11860 ft: 15333 corp: 41/1072b lim: 35 exec/s: 68 rss: 70Mb L: 17/35 MS: 1 CMP- DE: "\000\000\000\003"- 00:08:07.022 [2024-11-02 12:07:53.846907] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.846933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.022 [2024-11-02 12:07:53.846992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.847011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.022 [2024-11-02 12:07:53.847072] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.847087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.022 [2024-11-02 12:07:53.847148] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.847164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.022 #69 NEW cov: 11860 ft: 15351 corp: 42/1100b lim: 35 exec/s: 69 rss: 70Mb L: 28/35 MS: 1 InsertByte- 00:08:07.022 [2024-11-02 12:07:53.887016] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.887042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.022 [2024-11-02 12:07:53.887098] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.887114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.022 [2024-11-02 12:07:53.887171] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.887186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.022 [2024-11-02 12:07:53.887245] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.022 [2024-11-02 12:07:53.887263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 
cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.022 #70 NEW cov: 11860 ft: 15363 corp: 43/1134b lim: 35 exec/s: 35 rss: 70Mb L: 34/35 MS: 1 InsertByte- 00:08:07.022 #70 DONE cov: 11860 ft: 15363 corp: 43/1134b lim: 35 exec/s: 35 rss: 70Mb 00:08:07.022 ###### Recommended dictionary. ###### 00:08:07.022 "\002\000\000\000" # Uses: 1 00:08:07.022 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:07.022 "\377\377\000\000" # Uses: 0 00:08:07.022 "\000\000\000\003" # Uses: 0 00:08:07.022 ###### End of recommended dictionary. ###### 00:08:07.022 Done 70 runs in 2 second(s) 00:08:07.283 12:07:54 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:08:07.283 12:07:54 -- ../common.sh@72 -- # (( i++ )) 00:08:07.283 12:07:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.283 12:07:54 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:07.283 12:07:54 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:07.283 12:07:54 -- nvmf/run.sh@24 -- # local timen=1 00:08:07.283 12:07:54 -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.283 12:07:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:07.283 12:07:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:07.283 12:07:54 -- nvmf/run.sh@29 -- # printf %02d 15 00:08:07.283 12:07:54 -- nvmf/run.sh@29 -- # port=4415 00:08:07.283 12:07:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:07.283 12:07:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:07.283 12:07:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.283 12:07:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:08:07.283 [2024-11-02 12:07:54.061045] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:07.283 [2024-11-02 12:07:54.061114] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1150568 ] 00:08:07.283 EAL: No free 2048 kB hugepages reported on node 1 00:08:07.543 [2024-11-02 12:07:54.312090] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.543 [2024-11-02 12:07:54.339081] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:07.543 [2024-11-02 12:07:54.339217] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.543 [2024-11-02 12:07:54.390704] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:07.543 [2024-11-02 12:07:54.407101] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:07.543 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:07.543 INFO: Seed: 1066931501 00:08:07.543 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:07.543 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:07.543 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:07.543 INFO: A corpus is not provided, starting from an empty corpus 00:08:07.543 #2 INITED exec/s: 0 rss: 59Mb 00:08:07.543 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:07.543 This may also happen if the target rejected all inputs we tried so far 00:08:07.543 [2024-11-02 12:07:54.484398] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.543 [2024-11-02 12:07:54.484440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.543 [2024-11-02 12:07:54.484523] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.543 [2024-11-02 12:07:54.484541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.543 [2024-11-02 12:07:54.484626] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.543 [2024-11-02 12:07:54.484642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.112 NEW_FUNC[1/669]: 0x466a08 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:08.112 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.112 #3 NEW cov: 11560 ft: 11562 corp: 2/27b lim: 35 exec/s: 0 rss: 66Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:08:08.112 [2024-11-02 12:07:54.814614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.112 [2024-11-02 12:07:54.814653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.112 [2024-11-02 12:07:54.814782] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.112 [2024-11-02 12:07:54.814798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.112 NEW_FUNC[1/2]: 0x4868f8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:08.112 NEW_FUNC[2/2]: 0x1722148 in nvme_qpair_is_admin_queue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1090 00:08:08.112 #7 NEW cov: 11691 ft: 12334 corp: 3/52b lim: 35 exec/s: 0 rss: 67Mb L: 25/26 MS: 4 CopyPart-InsertByte-CopyPart-CrossOver- 00:08:08.112 [2024-11-02 12:07:54.854431] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.112 [2024-11-02 12:07:54.854462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.112 #13 NEW cov: 11697 ft: 12839 corp: 4/69b lim: 35 exec/s: 0 rss: 67Mb L: 17/26 MS: 1 EraseBytes- 00:08:08.112 [2024-11-02 12:07:54.904726] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.112 [2024-11-02 12:07:54.904754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.112 [2024-11-02 12:07:54.904890] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.112 [2024-11-02 12:07:54.904906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.112 [2024-11-02 12:07:54.905035] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.112 [2024-11-02 12:07:54.905052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.112 #14 NEW cov: 11782 ft: 13099 corp: 5/96b lim: 35 exec/s: 0 rss: 67Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:08:08.112 [2024-11-02 12:07:54.945033] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.113 [2024-11-02 12:07:54.945060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.113 [2024-11-02 12:07:54.945203] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.113 [2024-11-02 12:07:54.945225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.113 [2024-11-02 12:07:54.945331] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.113 [2024-11-02 12:07:54.945349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.113 #15 NEW cov: 11782 ft: 13514 corp: 6/127b lim: 35 exec/s: 0 rss: 67Mb L: 31/31 MS: 1 CopyPart- 00:08:08.113 [2024-11-02 12:07:54.984634] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.113 [2024-11-02 12:07:54.984661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.113 [2024-11-02 12:07:54.984800] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.113 [2024-11-02 12:07:54.984818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.113 #17 NEW cov: 11782 ft: 13653 corp: 7/146b lim: 35 exec/s: 0 rss: 67Mb L: 19/31 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:08.113 [2024-11-02 12:07:55.024731] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.113 [2024-11-02 12:07:55.024758] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.113 [2024-11-02 12:07:55.024884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.113 [2024-11-02 12:07:55.024902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.113 #18 NEW cov: 11782 ft: 13740 corp: 8/165b lim: 35 exec/s: 0 rss: 67Mb L: 19/31 MS: 1 CopyPart- 00:08:08.113 [2024-11-02 12:07:55.064953] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.113 [2024-11-02 12:07:55.064980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.373 #29 NEW cov: 11782 ft: 13803 corp: 9/185b lim: 35 exec/s: 0 rss: 67Mb L: 20/31 MS: 1 EraseBytes- 00:08:08.373 [2024-11-02 12:07:55.105238] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.105267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.105401] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.105420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.105560] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.105579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.373 #30 NEW cov: 11782 ft: 13893 corp: 10/211b lim: 35 exec/s: 0 rss: 67Mb L: 26/31 MS: 1 ChangeBit- 00:08:08.373 [2024-11-02 12:07:55.145412] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.145439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.145583] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.145600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.145738] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.145754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.373 #31 NEW cov: 11782 ft: 13966 corp: 11/238b lim: 35 exec/s: 0 rss: 67Mb L: 27/31 MS: 1 ShuffleBytes- 00:08:08.373 [2024-11-02 12:07:55.185601] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.185628] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.185772] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.185789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.185922] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.185940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.373 #32 NEW cov: 11782 ft: 14000 corp: 12/264b lim: 35 exec/s: 0 rss: 68Mb L: 26/31 MS: 1 ChangeByte- 00:08:08.373 [2024-11-02 12:07:55.225848] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.225876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.226005] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.226022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.226173] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.226192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.226328] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.226344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.373 #33 NEW cov: 11782 ft: 14156 corp: 13/294b lim: 35 exec/s: 0 rss: 68Mb L: 30/31 MS: 1 CopyPart- 00:08:08.373 [2024-11-02 12:07:55.266159] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.266185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.266320] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.266339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.266474] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.266490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.266631] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.266651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.266792] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000005bb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.266810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.373 #34 NEW cov: 11782 ft: 14229 corp: 14/329b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:08.373 [2024-11-02 12:07:55.305663] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.305690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.373 [2024-11-02 12:07:55.305810] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.373 [2024-11-02 12:07:55.305828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.373 #35 NEW cov: 11782 ft: 14244 corp: 15/349b lim: 35 exec/s: 0 rss: 69Mb L: 20/35 MS: 1 InsertByte- 00:08:08.373 [2024-11-02 12:07:55.346099] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.374 [2024-11-02 12:07:55.346127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.374 [2024-11-02 12:07:55.346256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.374 [2024-11-02 12:07:55.346273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.374 [2024-11-02 12:07:55.346404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.374 [2024-11-02 12:07:55.346422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.634 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:08.634 #36 NEW cov: 11805 ft: 14306 corp: 16/376b lim: 35 exec/s: 0 rss: 69Mb L: 27/35 MS: 1 CrossOver- 00:08:08.635 [2024-11-02 12:07:55.396411] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.396440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.396582] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.396597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.635 
[2024-11-02 12:07:55.396736] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000049e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.396754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.396878] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.396895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.635 #37 NEW cov: 11805 ft: 14324 corp: 17/409b lim: 35 exec/s: 0 rss: 69Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:08:08.635 [2024-11-02 12:07:55.436608] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.436637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.436759] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.436774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.436894] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.436909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.635 #38 NEW cov: 11805 ft: 14359 corp: 18/440b lim: 35 exec/s: 38 rss: 69Mb L: 31/35 MS: 1 ShuffleBytes- 00:08:08.635 [2024-11-02 12:07:55.476440] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.476469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.476603] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.476620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.476760] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.476776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.635 #39 NEW cov: 11805 ft: 14375 corp: 19/467b lim: 35 exec/s: 39 rss: 69Mb L: 27/35 MS: 1 ChangeByte- 00:08:08.635 [2024-11-02 12:07:55.516807] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.516834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.516969] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.516986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.517130] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000049e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.517148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.517288] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.517306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.635 #40 NEW cov: 11805 ft: 14400 corp: 20/500b lim: 35 exec/s: 40 rss: 69Mb L: 33/35 MS: 1 CopyPart- 00:08:08.635 [2024-11-02 12:07:55.557043] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.557071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.557207] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.557223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.557348] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.557368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.635 #41 NEW cov: 11805 ft: 14407 corp: 21/534b lim: 35 exec/s: 41 rss: 69Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:08:08.635 [2024-11-02 12:07:55.596912] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.596940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.597076] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.597106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.635 [2024-11-02 12:07:55.597236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.635 [2024-11-02 12:07:55.597255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.895 #42 NEW cov: 11805 ft: 14459 corp: 22/561b lim: 35 exec/s: 42 rss: 69Mb L: 27/35 MS: 1 CrossOver- 00:08:08.895 [2024-11-02 12:07:55.636932] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:08.895 [2024-11-02 12:07:55.636960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.637097] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.637114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.637241] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.637258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.895 #43 NEW cov: 11805 ft: 14486 corp: 23/588b lim: 35 exec/s: 43 rss: 69Mb L: 27/35 MS: 1 ChangeByte- 00:08:08.895 [2024-11-02 12:07:55.677286] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.677315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.677450] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.677470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.677599] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000049e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.677620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.677753] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.677770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.895 #44 NEW cov: 11805 ft: 14502 corp: 24/621b lim: 35 exec/s: 44 rss: 69Mb L: 33/35 MS: 1 CMP- DE: "\006\000\000\000\000\000\000\000"- 00:08:08.895 [2024-11-02 12:07:55.717269] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.717299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.717377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.717395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.895 #45 NEW cov: 11805 ft: 14520 corp: 25/645b lim: 35 exec/s: 45 rss: 69Mb L: 24/35 MS: 1 EraseBytes- 00:08:08.895 [2024-11-02 12:07:55.757468] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.757497] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.757640] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000001a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.757658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.757790] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000049e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.757808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.757942] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.757962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.895 #46 NEW cov: 11805 ft: 14562 corp: 26/678b lim: 35 exec/s: 46 rss: 69Mb L: 33/35 MS: 1 ChangeByte- 00:08:08.895 [2024-11-02 12:07:55.807682] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.807711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.807838] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000001a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.807858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.807997] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000049e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.808016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.808147] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.808165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.895 #47 NEW cov: 11805 ft: 14573 corp: 27/711b lim: 35 exec/s: 47 rss: 69Mb L: 33/35 MS: 1 ChangeByte- 00:08:08.895 [2024-11-02 12:07:55.857443] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.857471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.857599] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.857618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.857761] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000002ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.857786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.895 [2024-11-02 12:07:55.857919] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.895 [2024-11-02 12:07:55.857936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.155 #48 NEW cov: 11805 ft: 14588 corp: 28/742b lim: 35 exec/s: 48 rss: 70Mb L: 31/35 MS: 1 InsertByte- 00:08:09.155 [2024-11-02 12:07:55.907522] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.155 [2024-11-02 12:07:55.907551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.155 [2024-11-02 12:07:55.907686] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.155 [2024-11-02 12:07:55.907703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.155 [2024-11-02 12:07:55.907834] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000049e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.155 [2024-11-02 12:07:55.907853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.155 [2024-11-02 12:07:55.907984] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.155 [2024-11-02 12:07:55.908006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.155 #49 NEW cov: 11805 ft: 14604 corp: 29/775b lim: 35 exec/s: 49 rss: 70Mb L: 33/35 MS: 1 ChangeBit- 00:08:09.155 [2024-11-02 12:07:55.947809] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.155 [2024-11-02 12:07:55.947840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.155 [2024-11-02 12:07:55.947986] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.155 [2024-11-02 12:07:55.948008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.155 [2024-11-02 12:07:55.948134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.155 [2024-11-02 12:07:55.948151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.155 #50 NEW cov: 11805 ft: 14622 corp: 30/802b lim: 35 exec/s: 50 rss: 70Mb L: 27/35 MS: 1 ChangeBit- 00:08:09.155 [2024-11-02 12:07:55.997830] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:09.155 [2024-11-02 12:07:55.997860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.155 [2024-11-02 12:07:55.997999] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.155 [2024-11-02 12:07:55.998018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.155 [2024-11-02 12:07:55.998157] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.155 [2024-11-02 12:07:55.998175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.155 #51 NEW cov: 11805 ft: 14639 corp: 31/829b lim: 35 exec/s: 51 rss: 70Mb L: 27/35 MS: 1 ChangeByte- 00:08:09.155 [2024-11-02 12:07:56.038291] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.156 [2024-11-02 12:07:56.038319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.156 [2024-11-02 12:07:56.038468] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.156 [2024-11-02 12:07:56.038486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.156 [2024-11-02 12:07:56.038623] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000049e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.156 [2024-11-02 12:07:56.038642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.156 [2024-11-02 12:07:56.038781] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.156 [2024-11-02 12:07:56.038799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.156 #52 NEW cov: 11805 ft: 14649 corp: 32/862b lim: 35 exec/s: 52 rss: 70Mb L: 33/35 MS: 1 ChangeByte- 00:08:09.156 [2024-11-02 12:07:56.088144] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.156 [2024-11-02 12:07:56.088173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.156 [2024-11-02 12:07:56.088297] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.156 [2024-11-02 12:07:56.088314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.156 #53 NEW cov: 11805 ft: 14650 corp: 33/887b lim: 35 exec/s: 53 rss: 70Mb L: 25/35 MS: 1 ChangeBit- 00:08:09.156 [2024-11-02 12:07:56.128540] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.156 [2024-11-02 12:07:56.128569] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.156 [2024-11-02 12:07:56.128704] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.156 [2024-11-02 12:07:56.128722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.156 [2024-11-02 12:07:56.128841] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000071d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.156 [2024-11-02 12:07:56.128859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.156 [2024-11-02 12:07:56.128988] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.156 [2024-11-02 12:07:56.129012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.416 #54 NEW cov: 11805 ft: 14661 corp: 34/922b lim: 35 exec/s: 54 rss: 70Mb L: 35/35 MS: 1 InsertByte- 00:08:09.416 [2024-11-02 12:07:56.178429] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.178458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.416 #55 NEW cov: 11805 ft: 14672 corp: 35/936b lim: 35 exec/s: 55 rss: 70Mb L: 14/35 MS: 1 CrossOver- 00:08:09.416 [2024-11-02 12:07:56.218896] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.218929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.416 [2024-11-02 12:07:56.219063] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.219082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.416 [2024-11-02 12:07:56.219216] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.219234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.416 [2024-11-02 12:07:56.219342] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.219359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.416 #56 NEW cov: 11805 ft: 14697 corp: 36/970b lim: 35 exec/s: 56 rss: 70Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:08:09.416 [2024-11-02 12:07:56.268507] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.268535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.416 
[2024-11-02 12:07:56.268670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.268689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.416 #57 NEW cov: 11805 ft: 14763 corp: 37/990b lim: 35 exec/s: 57 rss: 70Mb L: 20/35 MS: 1 EraseBytes- 00:08:09.416 [2024-11-02 12:07:56.309022] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.309049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.416 [2024-11-02 12:07:56.309184] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.309202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.416 [2024-11-02 12:07:56.309337] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000049e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.309355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.416 [2024-11-02 12:07:56.309488] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.309505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.416 #58 NEW cov: 11805 ft: 14773 corp: 38/1024b lim: 35 exec/s: 58 rss: 70Mb L: 34/35 MS: 1 CopyPart- 00:08:09.416 [2024-11-02 12:07:56.348975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.349006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.416 [2024-11-02 12:07:56.349145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.349169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.416 [2024-11-02 12:07:56.349300] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.349319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.416 #59 NEW cov: 11805 ft: 14782 corp: 39/1051b lim: 35 exec/s: 59 rss: 70Mb L: 27/35 MS: 1 ChangeBit- 00:08:09.416 [2024-11-02 12:07:56.389382] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.389410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.416 [2024-11-02 12:07:56.389543] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:5 cdw10:000005a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.389561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.416 [2024-11-02 12:07:56.389696] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.389713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.416 [2024-11-02 12:07:56.389845] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.416 [2024-11-02 12:07:56.389862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.677 #60 NEW cov: 11805 ft: 14792 corp: 40/1085b lim: 35 exec/s: 60 rss: 70Mb L: 34/35 MS: 1 ChangeBit- 00:08:09.677 [2024-11-02 12:07:56.429346] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.677 [2024-11-02 12:07:56.429375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.677 [2024-11-02 12:07:56.429514] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:09.677 [2024-11-02 12:07:56.429530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.677 #61 NEW cov: 11805 ft: 14814 corp: 41/1110b lim: 35 exec/s: 30 rss: 70Mb L: 25/35 MS: 1 ChangeByte- 00:08:09.677 #61 DONE cov: 11805 ft: 14814 corp: 41/1110b lim: 35 exec/s: 30 rss: 70Mb 00:08:09.677 ###### Recommended dictionary. ###### 00:08:09.677 "\006\000\000\000\000\000\000\000" # Uses: 0 00:08:09.677 ###### End of recommended dictionary. 
###### 00:08:09.677 Done 61 runs in 2 second(s) 00:08:09.677 12:07:56 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:08:09.677 12:07:56 -- ../common.sh@72 -- # (( i++ )) 00:08:09.677 12:07:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:09.677 12:07:56 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:09.677 12:07:56 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:09.677 12:07:56 -- nvmf/run.sh@24 -- # local timen=1 00:08:09.677 12:07:56 -- nvmf/run.sh@25 -- # local core=0x1 00:08:09.677 12:07:56 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:09.677 12:07:56 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:09.677 12:07:56 -- nvmf/run.sh@29 -- # printf %02d 16 00:08:09.677 12:07:56 -- nvmf/run.sh@29 -- # port=4416 00:08:09.677 12:07:56 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:09.677 12:07:56 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:09.677 12:07:56 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:09.677 12:07:56 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:08:09.677 [2024-11-02 12:07:56.610683] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:09.677 [2024-11-02 12:07:56.610743] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1151105 ] 00:08:09.677 EAL: No free 2048 kB hugepages reported on node 1 00:08:09.936 [2024-11-02 12:07:56.869477] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.936 [2024-11-02 12:07:56.898902] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:09.936 [2024-11-02 12:07:56.899032] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.196 [2024-11-02 12:07:56.950513] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.196 [2024-11-02 12:07:56.966885] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:10.196 INFO: Running with entropic power schedule (0xFF, 100). 00:08:10.196 INFO: Seed: 3624944697 00:08:10.196 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:10.196 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:10.196 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:10.196 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.196 #2 INITED exec/s: 0 rss: 59Mb 00:08:10.196 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:10.196 This may also happen if the target rejected all inputs we tried so far 00:08:10.196 [2024-11-02 12:07:57.015231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.196 [2024-11-02 12:07:57.015261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.456 NEW_FUNC[1/669]: 0x467ec8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:10.456 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:10.456 #21 NEW cov: 11629 ft: 11668 corp: 2/30b lim: 105 exec/s: 0 rss: 67Mb L: 29/29 MS: 4 ChangeBinInt-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:10.456 [2024-11-02 12:07:57.316009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.456 [2024-11-02 12:07:57.316042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.456 NEW_FUNC[1/2]: 0x19613b8 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:528 00:08:10.456 NEW_FUNC[2/2]: 0x1966848 in _reactor_run /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:894 00:08:10.456 #22 NEW cov: 11780 ft: 12207 corp: 3/59b lim: 105 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ChangeByte- 00:08:10.456 [2024-11-02 12:07:57.366086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.456 [2024-11-02 12:07:57.366116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.456 #28 NEW cov: 11786 ft: 12378 corp: 4/88b lim: 105 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 CrossOver- 00:08:10.456 [2024-11-02 12:07:57.406502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12659530243895177135 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.456 [2024-11-02 12:07:57.406534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.456 [2024-11-02 12:07:57.406570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12659530246663417775 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.456 [2024-11-02 12:07:57.406589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.456 [2024-11-02 12:07:57.406640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12659530246663417775 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.456 [2024-11-02 12:07:57.406655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.456 [2024-11-02 12:07:57.406707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:12659530246663417775 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.456 [2024-11-02 12:07:57.406723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.456 #29 NEW cov: 11871 ft: 13272 corp: 5/187b lim: 105 exec/s: 0 rss: 67Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:10.716 [2024-11-02 12:07:57.446253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069853544447 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.716 [2024-11-02 12:07:57.446280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.716 #32 NEW cov: 11871 ft: 13493 corp: 6/226b lim: 105 exec/s: 0 rss: 67Mb L: 39/99 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:10.716 [2024-11-02 12:07:57.486374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.716 [2024-11-02 12:07:57.486400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.716 #33 NEW cov: 11871 ft: 13556 corp: 7/255b lim: 105 exec/s: 0 rss: 67Mb L: 29/99 MS: 1 CrossOver- 00:08:10.716 [2024-11-02 12:07:57.526524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.716 [2024-11-02 12:07:57.526552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.716 #34 NEW cov: 11871 ft: 13603 corp: 8/284b lim: 105 exec/s: 0 rss: 67Mb L: 29/99 MS: 1 ChangeBinInt- 00:08:10.716 [2024-11-02 12:07:57.566660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069853544447 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.716 [2024-11-02 12:07:57.566689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.716 #35 NEW cov: 11871 ft: 13685 corp: 9/312b lim: 105 exec/s: 0 rss: 67Mb L: 28/99 MS: 1 EraseBytes- 00:08:10.716 [2024-11-02 12:07:57.607248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12659530243895177135 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.716 [2024-11-02 12:07:57.607275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.716 [2024-11-02 12:07:57.607342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12659530246663417775 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.716 [2024-11-02 12:07:57.607358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.716 [2024-11-02 12:07:57.607412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12659530246663417775 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.716 [2024-11-02 12:07:57.607426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.716 [2024-11-02 12:07:57.607478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:12659530246663417775 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.716 [2024-11-02 12:07:57.607493] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.716 [2024-11-02 12:07:57.607551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:12659530246663417775 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.716 [2024-11-02 12:07:57.607566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:10.716 #36 NEW cov: 11871 ft: 13818 corp: 10/417b lim: 105 exec/s: 0 rss: 67Mb L: 105/105 MS: 1 CopyPart- 00:08:10.716 [2024-11-02 12:07:57.656859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493824 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.716 [2024-11-02 12:07:57.656887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.716 #37 NEW cov: 11871 ft: 13874 corp: 11/447b lim: 105 exec/s: 0 rss: 67Mb L: 30/105 MS: 1 InsertByte- 00:08:10.975 [2024-11-02 12:07:57.697016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:8797082877952 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.975 [2024-11-02 12:07:57.697044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.975 #38 NEW cov: 11871 ft: 13952 corp: 12/476b lim: 105 exec/s: 0 rss: 68Mb L: 29/105 MS: 1 ChangeBit- 00:08:10.975 [2024-11-02 12:07:57.737209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.976 [2024-11-02 12:07:57.737237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.976 [2024-11-02 12:07:57.737274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.976 [2024-11-02 12:07:57.737290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.976 #39 NEW cov: 11871 ft: 14307 corp: 13/537b lim: 105 exec/s: 0 rss: 68Mb L: 61/105 MS: 1 InsertRepeatedBytes- 00:08:10.976 [2024-11-02 12:07:57.777339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.976 [2024-11-02 12:07:57.777367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.976 [2024-11-02 12:07:57.777400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.976 [2024-11-02 12:07:57.777416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.976 #40 NEW cov: 11871 ft: 14334 corp: 14/598b lim: 105 exec/s: 0 rss: 68Mb L: 61/105 MS: 1 ChangeBinInt- 00:08:10.976 [2024-11-02 12:07:57.817461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069853544447 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.976 [2024-11-02 12:07:57.817490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.976 [2024-11-02 12:07:57.817543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.976 [2024-11-02 12:07:57.817559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.976 #46 NEW cov: 11871 ft: 14435 corp: 15/655b lim: 105 exec/s: 0 rss: 68Mb L: 57/105 MS: 1 CrossOver- 00:08:10.976 [2024-11-02 12:07:57.857459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069853544447 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.976 [2024-11-02 12:07:57.857487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.976 #47 NEW cov: 11871 ft: 14447 corp: 16/676b lim: 105 exec/s: 0 rss: 68Mb L: 21/105 MS: 1 EraseBytes- 00:08:10.976 [2024-11-02 12:07:57.897549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3866624 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.976 [2024-11-02 12:07:57.897578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.976 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:10.976 #48 NEW cov: 11894 ft: 14507 corp: 17/705b lim: 105 exec/s: 0 rss: 68Mb L: 29/105 MS: 1 ShuffleBytes- 00:08:10.976 [2024-11-02 12:07:57.937706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:8797082877952 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.976 [2024-11-02 12:07:57.937734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.235 #49 NEW cov: 11894 ft: 14544 corp: 18/734b lim: 105 exec/s: 0 rss: 68Mb L: 29/105 MS: 1 ShuffleBytes- 00:08:11.235 [2024-11-02 12:07:57.977808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069853544447 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.235 [2024-11-02 12:07:57.977835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.235 #50 NEW cov: 11894 ft: 14645 corp: 19/762b lim: 105 exec/s: 50 rss: 68Mb L: 28/105 MS: 1 ShuffleBytes- 00:08:11.235 [2024-11-02 12:07:58.017918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069853544447 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.235 [2024-11-02 12:07:58.017946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.235 #51 NEW cov: 11894 ft: 14663 corp: 20/783b lim: 105 exec/s: 51 rss: 68Mb L: 21/105 MS: 1 ChangeBinInt- 00:08:11.235 [2024-11-02 12:07:58.058043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069853544447 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.235 [2024-11-02 12:07:58.058069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.235 #52 NEW cov: 11894 ft: 14664 corp: 21/804b lim: 105 exec/s: 52 rss: 68Mb L: 21/105 MS: 
1 ChangeByte- 00:08:11.235 [2024-11-02 12:07:58.098534] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.235 [2024-11-02 12:07:58.098562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.235 [2024-11-02 12:07:58.098609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.235 [2024-11-02 12:07:58.098625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.235 [2024-11-02 12:07:58.098677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.235 [2024-11-02 12:07:58.098707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.235 [2024-11-02 12:07:58.098762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.235 [2024-11-02 12:07:58.098777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.235 #53 NEW cov: 11894 ft: 14692 corp: 22/891b lim: 105 exec/s: 53 rss: 68Mb L: 87/105 MS: 1 CopyPart- 00:08:11.235 [2024-11-02 12:07:58.138393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.235 [2024-11-02 12:07:58.138420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.236 [2024-11-02 12:07:58.138471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4096 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.236 [2024-11-02 12:07:58.138487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.236 #54 NEW cov: 11894 ft: 14711 corp: 23/952b lim: 105 exec/s: 54 rss: 68Mb L: 61/105 MS: 1 ChangeBit- 00:08:11.236 [2024-11-02 12:07:58.178345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069853544447 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.236 [2024-11-02 12:07:58.178371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.236 #55 NEW cov: 11894 ft: 14787 corp: 24/980b lim: 105 exec/s: 55 rss: 68Mb L: 28/105 MS: 1 EraseBytes- 00:08:11.236 [2024-11-02 12:07:58.208460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.236 [2024-11-02 12:07:58.208487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.495 #56 NEW cov: 11894 ft: 14798 corp: 25/1012b lim: 105 exec/s: 56 rss: 68Mb L: 32/105 MS: 1 EraseBytes- 00:08:11.495 [2024-11-02 12:07:58.248675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1852702720 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.495 [2024-11-02 
12:07:58.248702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.495 [2024-11-02 12:07:58.248756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.495 [2024-11-02 12:07:58.248772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.495 #57 NEW cov: 11894 ft: 14841 corp: 26/1073b lim: 105 exec/s: 57 rss: 68Mb L: 61/105 MS: 1 CrossOver- 00:08:11.495 [2024-11-02 12:07:58.288693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.495 [2024-11-02 12:07:58.288720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.495 #58 NEW cov: 11894 ft: 14853 corp: 27/1113b lim: 105 exec/s: 58 rss: 69Mb L: 40/105 MS: 1 CopyPart- 00:08:11.495 [2024-11-02 12:07:58.328796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.495 [2024-11-02 12:07:58.328823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.495 #59 NEW cov: 11894 ft: 14861 corp: 28/1152b lim: 105 exec/s: 59 rss: 69Mb L: 39/105 MS: 1 InsertRepeatedBytes- 00:08:11.495 [2024-11-02 12:07:58.368935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.495 [2024-11-02 12:07:58.368961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.495 #60 NEW cov: 11894 ft: 14937 corp: 29/1192b lim: 105 exec/s: 60 rss: 69Mb L: 40/105 MS: 1 ChangeByte- 00:08:11.495 [2024-11-02 12:07:58.409142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:829320704 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.495 [2024-11-02 12:07:58.409168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.495 [2024-11-02 12:07:58.409224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.495 [2024-11-02 12:07:58.409240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.495 #61 NEW cov: 11894 ft: 14978 corp: 30/1254b lim: 105 exec/s: 61 rss: 69Mb L: 62/105 MS: 1 InsertByte- 00:08:11.495 [2024-11-02 12:07:58.449520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.495 [2024-11-02 12:07:58.449547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.495 [2024-11-02 12:07:58.449612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.495 [2024-11-02 12:07:58.449628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.495 [2024-11-02 12:07:58.449681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.495 [2024-11-02 12:07:58.449695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.495 [2024-11-02 12:07:58.449747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.495 [2024-11-02 12:07:58.449763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.755 #62 NEW cov: 11894 ft: 14996 corp: 31/1341b lim: 105 exec/s: 62 rss: 69Mb L: 87/105 MS: 1 InsertRepeatedBytes- 00:08:11.755 [2024-11-02 12:07:58.489622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.489648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.755 [2024-11-02 12:07:58.489714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.489731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.755 [2024-11-02 12:07:58.489783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.489797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.755 [2024-11-02 12:07:58.489852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.489868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.755 #63 NEW cov: 11894 ft: 15008 corp: 32/1428b lim: 105 exec/s: 63 rss: 69Mb L: 87/105 MS: 1 ChangeBit- 00:08:11.755 [2024-11-02 12:07:58.529395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.529422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.755 #64 NEW cov: 11894 ft: 15065 corp: 33/1460b lim: 105 exec/s: 64 rss: 69Mb L: 32/105 MS: 1 ChangeBit- 00:08:11.755 [2024-11-02 12:07:58.569653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:177939152960 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.569680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.755 [2024-11-02 12:07:58.569722] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446462603027808255 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.569738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.755 [2024-11-02 12:07:58.609754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:177939152960 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.609781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.755 [2024-11-02 12:07:58.609819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.609834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.755 #66 NEW cov: 11894 ft: 15068 corp: 34/1510b lim: 105 exec/s: 66 rss: 69Mb L: 50/105 MS: 2 CrossOver-CrossOver- 00:08:11.755 [2024-11-02 12:07:58.649774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989914624 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.649801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.755 #67 NEW cov: 11894 ft: 15119 corp: 35/1540b lim: 105 exec/s: 67 rss: 69Mb L: 30/105 MS: 1 InsertByte- 00:08:11.755 [2024-11-02 12:07:58.689998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.690026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.755 [2024-11-02 12:07:58.690065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.690080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.755 #68 NEW cov: 11894 ft: 15124 corp: 36/1602b lim: 105 exec/s: 68 rss: 69Mb L: 62/105 MS: 1 InsertByte- 00:08:11.755 [2024-11-02 12:07:58.730377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.730404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.755 [2024-11-02 12:07:58.730459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.730475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.755 [2024-11-02 12:07:58.730528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.755 [2024-11-02 12:07:58.730543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.755 [2024-11-02 12:07:58.730596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.756 [2024-11-02 12:07:58.730610] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.015 #69 NEW cov: 11894 ft: 15129 corp: 37/1689b lim: 105 exec/s: 69 rss: 69Mb L: 87/105 MS: 1 ShuffleBytes- 00:08:12.015 [2024-11-02 12:07:58.770589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12659530243895177135 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.770615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.015 [2024-11-02 12:07:58.770671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12659530246663417775 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.770685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.015 [2024-11-02 12:07:58.770740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.770755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.015 [2024-11-02 12:07:58.770807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18374686483966590975 len:2049 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.770822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.015 [2024-11-02 12:07:58.770877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:12659530243715956735 len:44976 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.770893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:12.015 #70 NEW cov: 11894 ft: 15139 corp: 38/1794b lim: 105 exec/s: 70 rss: 69Mb L: 105/105 MS: 1 CrossOver- 00:08:12.015 [2024-11-02 12:07:58.810231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.810258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.015 #71 NEW cov: 11894 ft: 15156 corp: 39/1833b lim: 105 exec/s: 71 rss: 70Mb L: 39/105 MS: 1 ChangeBinInt- 00:08:12.015 [2024-11-02 12:07:58.850431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:177939152960 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.850458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.015 [2024-11-02 12:07:58.850495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.850511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.015 #72 NEW cov: 11894 ft: 15171 corp: 40/1884b lim: 105 exec/s: 72 rss: 70Mb L: 51/105 MS: 1 
InsertByte- 00:08:12.015 [2024-11-02 12:07:58.890561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.890590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.015 [2024-11-02 12:07:58.890629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4467570830351532032 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.890645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.015 #73 NEW cov: 11894 ft: 15179 corp: 41/1946b lim: 105 exec/s: 73 rss: 70Mb L: 62/105 MS: 1 ChangeBinInt- 00:08:12.015 [2024-11-02 12:07:58.930551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1845493824 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.930579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.015 #79 NEW cov: 11894 ft: 15192 corp: 42/1976b lim: 105 exec/s: 79 rss: 70Mb L: 30/105 MS: 1 CrossOver- 00:08:12.015 [2024-11-02 12:07:58.971075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.971103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.015 [2024-11-02 12:07:58.971167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.971187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.015 [2024-11-02 12:07:58.971240] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.015 [2024-11-02 12:07:58.971255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.015 [2024-11-02 12:07:58.971306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.016 [2024-11-02 12:07:58.971322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.275 #80 NEW cov: 11894 ft: 15205 corp: 43/2064b lim: 105 exec/s: 80 rss: 70Mb L: 88/105 MS: 1 InsertByte- 00:08:12.275 [2024-11-02 12:07:59.010793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.275 [2024-11-02 12:07:59.010822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.275 #81 NEW cov: 11894 ft: 15286 corp: 44/2104b lim: 105 exec/s: 40 rss: 70Mb L: 40/105 MS: 1 ChangeBinInt- 00:08:12.275 #81 DONE cov: 11894 ft: 15286 corp: 44/2104b lim: 105 exec/s: 40 rss: 70Mb 00:08:12.275 Done 81 runs in 2 second(s) 00:08:12.275 12:07:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 
00:08:12.275 12:07:59 -- ../common.sh@72 -- # (( i++ )) 00:08:12.275 12:07:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.275 12:07:59 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:12.275 12:07:59 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:12.275 12:07:59 -- nvmf/run.sh@24 -- # local timen=1 00:08:12.275 12:07:59 -- nvmf/run.sh@25 -- # local core=0x1 00:08:12.275 12:07:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:12.275 12:07:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:12.275 12:07:59 -- nvmf/run.sh@29 -- # printf %02d 17 00:08:12.275 12:07:59 -- nvmf/run.sh@29 -- # port=4417 00:08:12.275 12:07:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:12.275 12:07:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:12.275 12:07:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:12.275 12:07:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:08:12.275 [2024-11-02 12:07:59.185901] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:12.275 [2024-11-02 12:07:59.185980] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1151550 ] 00:08:12.275 EAL: No free 2048 kB hugepages reported on node 1 00:08:12.534 [2024-11-02 12:07:59.435631] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.534 [2024-11-02 12:07:59.464478] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:12.534 [2024-11-02 12:07:59.464616] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.802 [2024-11-02 12:07:59.515939] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:12.802 [2024-11-02 12:07:59.532307] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:12.802 INFO: Running with entropic power schedule (0xFF, 100). 00:08:12.802 INFO: Seed: 1895956665 00:08:12.802 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:12.802 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:12.802 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:12.802 INFO: A corpus is not provided, starting from an empty corpus 00:08:12.802 #2 INITED exec/s: 0 rss: 59Mb 00:08:12.802 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:12.802 This may also happen if the target rejected all inputs we tried so far 00:08:12.802 [2024-11-02 12:07:59.581370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.802 [2024-11-02 12:07:59.581400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.802 [2024-11-02 12:07:59.581464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.802 [2024-11-02 12:07:59.581479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.061 NEW_FUNC[1/671]: 0x46b1b8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:13.061 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.061 #5 NEW cov: 11686 ft: 11687 corp: 2/50b lim: 120 exec/s: 0 rss: 67Mb L: 49/49 MS: 3 CopyPart-InsertByte-InsertRepeatedBytes- 00:08:13.061 [2024-11-02 12:07:59.892197] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.061 [2024-11-02 12:07:59.892231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.061 [2024-11-02 12:07:59.892285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:11212726789901884315 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.061 [2024-11-02 12:07:59.892302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.061 NEW_FUNC[1/1]: 0x16ca2c8 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1456 00:08:13.061 #11 NEW cov: 11801 ft: 12148 corp: 3/118b lim: 120 exec/s: 0 rss: 67Mb L: 68/68 MS: 1 InsertRepeatedBytes- 00:08:13.061 [2024-11-02 12:07:59.942473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.061 [2024-11-02 12:07:59.942501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.061 [2024-11-02 12:07:59.942553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.061 [2024-11-02 12:07:59.942572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.061 [2024-11-02 12:07:59.942629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.061 [2024-11-02 12:07:59.942645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.061 #12 NEW cov: 11807 ft: 12805 corp: 4/203b lim: 120 exec/s: 0 rss: 67Mb L: 85/85 MS: 1 CopyPart- 00:08:13.061 [2024-11-02 12:07:59.982443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 
len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.061 [2024-11-02 12:07:59.982472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.061 [2024-11-02 12:07:59.982522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.061 [2024-11-02 12:07:59.982540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.061 #13 NEW cov: 11892 ft: 13217 corp: 5/252b lim: 120 exec/s: 0 rss: 67Mb L: 49/85 MS: 1 CrossOver- 00:08:13.061 [2024-11-02 12:08:00.022589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.061 [2024-11-02 12:08:00.022619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.061 [2024-11-02 12:08:00.022663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.061 [2024-11-02 12:08:00.022679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.319 #14 NEW cov: 11892 ft: 13288 corp: 6/302b lim: 120 exec/s: 0 rss: 67Mb L: 50/85 MS: 1 InsertByte- 00:08:13.319 [2024-11-02 12:08:00.062703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.319 [2024-11-02 12:08:00.062733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.319 [2024-11-02 12:08:00.062773] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.319 [2024-11-02 12:08:00.062789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.319 #20 NEW cov: 11892 ft: 13435 corp: 7/352b lim: 120 exec/s: 0 rss: 67Mb L: 50/85 MS: 1 CrossOver- 00:08:13.319 [2024-11-02 12:08:00.102839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.320 [2024-11-02 12:08:00.102871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.320 [2024-11-02 12:08:00.102940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:17592186044416 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.320 [2024-11-02 12:08:00.102955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.320 #21 NEW cov: 11892 ft: 13499 corp: 8/401b lim: 120 exec/s: 0 rss: 67Mb L: 49/85 MS: 1 ChangeBit- 00:08:13.320 [2024-11-02 12:08:00.143088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.320 [2024-11-02 12:08:00.143119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.320 [2024-11-02 12:08:00.143155] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.320 [2024-11-02 12:08:00.143170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.320 [2024-11-02 12:08:00.143224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.320 [2024-11-02 12:08:00.143239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.320 #22 NEW cov: 11892 ft: 13564 corp: 9/486b lim: 120 exec/s: 0 rss: 67Mb L: 85/85 MS: 1 ChangeBit- 00:08:13.320 [2024-11-02 12:08:00.192881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.320 [2024-11-02 12:08:00.192909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.320 #23 NEW cov: 11892 ft: 14468 corp: 10/521b lim: 120 exec/s: 0 rss: 68Mb L: 35/85 MS: 1 EraseBytes- 00:08:13.320 [2024-11-02 12:08:00.243239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.320 [2024-11-02 12:08:00.243266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.320 [2024-11-02 12:08:00.243301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.320 [2024-11-02 12:08:00.243317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.320 #24 NEW cov: 11892 ft: 14496 corp: 11/570b lim: 120 exec/s: 0 rss: 68Mb L: 49/85 MS: 1 ShuffleBytes- 00:08:13.320 [2024-11-02 12:08:00.283328] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.320 [2024-11-02 12:08:00.283355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.320 [2024-11-02 12:08:00.283417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.320 [2024-11-02 12:08:00.283433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.579 #25 NEW cov: 11892 ft: 14552 corp: 12/620b lim: 120 exec/s: 0 rss: 68Mb L: 50/85 MS: 1 ChangeBinInt- 00:08:13.579 [2024-11-02 12:08:00.323285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4211081216 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.579 [2024-11-02 12:08:00.323312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.579 #26 NEW cov: 11892 ft: 14650 corp: 13/655b lim: 120 exec/s: 0 rss: 68Mb L: 35/85 MS: 1 ChangeByte- 00:08:13.579 [2024-11-02 12:08:00.373600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:13.579 [2024-11-02 12:08:00.373630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.579 [2024-11-02 12:08:00.373669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.579 [2024-11-02 12:08:00.373685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.579 #27 NEW cov: 11892 ft: 14728 corp: 14/705b lim: 120 exec/s: 0 rss: 68Mb L: 50/85 MS: 1 ChangeByte- 00:08:13.579 [2024-11-02 12:08:00.413735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.579 [2024-11-02 12:08:00.413764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.579 [2024-11-02 12:08:00.413815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.579 [2024-11-02 12:08:00.413831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.579 #28 NEW cov: 11892 ft: 14792 corp: 15/755b lim: 120 exec/s: 0 rss: 68Mb L: 50/85 MS: 1 ChangeBinInt- 00:08:13.579 [2024-11-02 12:08:00.453829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.579 [2024-11-02 12:08:00.453856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.579 [2024-11-02 12:08:00.453891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.579 [2024-11-02 12:08:00.453906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.579 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:13.579 #29 NEW cov: 11915 ft: 14824 corp: 16/806b lim: 120 exec/s: 0 rss: 68Mb L: 51/85 MS: 1 InsertByte- 00:08:13.579 [2024-11-02 12:08:00.503980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.579 [2024-11-02 12:08:00.504013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.579 [2024-11-02 12:08:00.504073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5505024 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.579 [2024-11-02 12:08:00.504089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.579 #30 NEW cov: 11915 ft: 14833 corp: 17/856b lim: 120 exec/s: 0 rss: 68Mb L: 50/85 MS: 1 ChangeByte- 00:08:13.579 [2024-11-02 12:08:00.544228] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.579 [2024-11-02 12:08:00.544256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.579 [2024-11-02 12:08:00.544294] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.579 [2024-11-02 12:08:00.544310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.579 [2024-11-02 12:08:00.544366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.579 [2024-11-02 12:08:00.544381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.839 #31 NEW cov: 11915 ft: 14896 corp: 18/949b lim: 120 exec/s: 31 rss: 68Mb L: 93/93 MS: 1 InsertRepeatedBytes- 00:08:13.839 [2024-11-02 12:08:00.584061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.839 [2024-11-02 12:08:00.584088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.839 #32 NEW cov: 11915 ft: 14971 corp: 19/996b lim: 120 exec/s: 32 rss: 68Mb L: 47/93 MS: 1 EraseBytes- 00:08:13.839 [2024-11-02 12:08:00.624354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.839 [2024-11-02 12:08:00.624381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.839 [2024-11-02 12:08:00.624430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.839 [2024-11-02 12:08:00.624446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.839 #33 NEW cov: 11915 ft: 14994 corp: 20/1046b lim: 120 exec/s: 33 rss: 68Mb L: 50/93 MS: 1 InsertByte- 00:08:13.839 [2024-11-02 12:08:00.664455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.839 [2024-11-02 12:08:00.664482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.839 [2024-11-02 12:08:00.664531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:524288 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.839 [2024-11-02 12:08:00.664547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.839 #34 NEW cov: 11915 ft: 15001 corp: 21/1095b lim: 120 exec/s: 34 rss: 68Mb L: 49/93 MS: 1 ChangeBit- 00:08:13.839 [2024-11-02 12:08:00.704382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.839 [2024-11-02 12:08:00.704410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.839 #35 NEW cov: 11915 ft: 15033 corp: 22/1130b lim: 120 exec/s: 35 rss: 68Mb L: 35/93 MS: 1 ShuffleBytes- 00:08:13.839 [2024-11-02 12:08:00.744670] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.839 [2024-11-02 12:08:00.744697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.839 [2024-11-02 12:08:00.744734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.839 [2024-11-02 12:08:00.744749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.839 #36 NEW cov: 11915 ft: 15055 corp: 23/1179b lim: 120 exec/s: 36 rss: 68Mb L: 49/93 MS: 1 ChangeByte- 00:08:13.839 [2024-11-02 12:08:00.774762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.839 [2024-11-02 12:08:00.774789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.839 [2024-11-02 12:08:00.774828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.839 [2024-11-02 12:08:00.774844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.839 #37 NEW cov: 11915 ft: 15064 corp: 24/1228b lim: 120 exec/s: 37 rss: 68Mb L: 49/93 MS: 1 ChangeBit- 00:08:13.839 [2024-11-02 12:08:00.814857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.839 [2024-11-02 12:08:00.814885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.099 [2024-11-02 12:08:00.814923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.814940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.099 #38 NEW cov: 11915 ft: 15072 corp: 25/1277b lim: 120 exec/s: 38 rss: 68Mb L: 49/93 MS: 1 ChangeByte- 00:08:14.099 [2024-11-02 12:08:00.854953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.854981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.099 [2024-11-02 12:08:00.855035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.855051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.099 #39 NEW cov: 11915 ft: 15090 corp: 26/1326b lim: 120 exec/s: 39 rss: 68Mb L: 49/93 MS: 1 ChangeBit- 00:08:14.099 [2024-11-02 12:08:00.895261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.895287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.099 [2024-11-02 12:08:00.895340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.895355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.099 [2024-11-02 12:08:00.895413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.895430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.099 #40 NEW cov: 11915 ft: 15113 corp: 27/1411b lim: 120 exec/s: 40 rss: 68Mb L: 85/93 MS: 1 ChangeBit- 00:08:14.099 [2024-11-02 12:08:00.935524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.935552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.099 [2024-11-02 12:08:00.935599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.935615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.099 [2024-11-02 12:08:00.935670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.935701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.099 [2024-11-02 12:08:00.935755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.935771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.099 #41 NEW cov: 11915 ft: 15484 corp: 28/1523b lim: 120 exec/s: 41 rss: 68Mb L: 112/112 MS: 1 InsertRepeatedBytes- 00:08:14.099 [2024-11-02 12:08:00.985355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.985382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.099 [2024-11-02 12:08:00.985436] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:00.985453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.099 #42 NEW cov: 11915 ft: 15503 corp: 29/1573b lim: 120 exec/s: 42 rss: 68Mb L: 50/112 MS: 1 InsertByte- 00:08:14.099 [2024-11-02 12:08:01.025511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:01.025539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.099 [2024-11-02 12:08:01.025593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2269391999729664 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:01.025611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.099 #43 NEW cov: 11915 ft: 15514 corp: 30/1622b lim: 120 exec/s: 43 rss: 69Mb L: 49/112 MS: 1 ChangeBit- 00:08:14.099 [2024-11-02 12:08:01.065743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:01.065770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.099 [2024-11-02 12:08:01.065804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:01.065819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.099 [2024-11-02 12:08:01.065878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.099 [2024-11-02 12:08:01.065894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.359 #44 NEW cov: 11915 ft: 15565 corp: 31/1707b lim: 120 exec/s: 44 rss: 69Mb L: 85/112 MS: 1 ChangeByte- 00:08:14.359 [2024-11-02 12:08:01.106032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.106060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.359 [2024-11-02 12:08:01.106103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.106116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.359 [2024-11-02 12:08:01.106168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.106183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.359 [2024-11-02 12:08:01.106238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.106253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.359 #45 NEW cov: 11915 ft: 15574 corp: 32/1819b lim: 120 exec/s: 45 rss: 69Mb L: 112/112 MS: 1 ChangeByte- 00:08:14.359 [2024-11-02 12:08:01.145724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.145751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.359 #46 NEW cov: 11915 ft: 15583 corp: 33/1854b lim: 120 exec/s: 46 rss: 69Mb L: 35/112 MS: 1 ChangeBinInt- 00:08:14.359 [2024-11-02 12:08:01.185801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.185828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.359 #47 NEW cov: 11915 ft: 15590 corp: 34/1890b lim: 120 exec/s: 47 rss: 69Mb L: 36/112 MS: 1 InsertByte- 00:08:14.359 [2024-11-02 12:08:01.226266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.226292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.359 [2024-11-02 12:08:01.226345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.226361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.359 [2024-11-02 12:08:01.226416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.226431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.359 #48 NEW cov: 11915 ft: 15594 corp: 35/1975b lim: 120 exec/s: 48 rss: 69Mb L: 85/112 MS: 1 CopyPart- 00:08:14.359 [2024-11-02 12:08:01.266253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.266281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.359 [2024-11-02 12:08:01.266340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2269391999729664 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.266357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.359 #49 NEW cov: 11915 ft: 15601 corp: 36/2024b lim: 120 exec/s: 49 rss: 69Mb L: 49/112 MS: 1 ShuffleBytes- 00:08:14.359 [2024-11-02 12:08:01.306656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.306685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.359 [2024-11-02 12:08:01.306734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15191436295996101330 len:53971 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.306750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.359 [2024-11-02 12:08:01.306803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744069414584575 
len:65281 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.306819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.359 [2024-11-02 12:08:01.306875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:6052837899185946624 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.359 [2024-11-02 12:08:01.306890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.618 #50 NEW cov: 11915 ft: 15620 corp: 37/2125b lim: 120 exec/s: 50 rss: 69Mb L: 101/112 MS: 1 InsertRepeatedBytes- 00:08:14.619 [2024-11-02 12:08:01.356670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.356699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.619 [2024-11-02 12:08:01.356733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.356749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.619 [2024-11-02 12:08:01.356806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.356823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.619 #51 NEW cov: 11915 ft: 15629 corp: 38/2210b lim: 120 exec/s: 51 rss: 69Mb L: 85/112 MS: 1 ChangeByte- 00:08:14.619 [2024-11-02 12:08:01.396562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.396590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.619 [2024-11-02 12:08:01.396640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.396654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.619 #52 NEW cov: 11915 ft: 15633 corp: 39/2259b lim: 120 exec/s: 52 rss: 69Mb L: 49/112 MS: 1 ShuffleBytes- 00:08:14.619 [2024-11-02 12:08:01.436551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.436578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.619 #53 NEW cov: 11915 ft: 15664 corp: 40/2290b lim: 120 exec/s: 53 rss: 69Mb L: 31/112 MS: 1 EraseBytes- 00:08:14.619 [2024-11-02 12:08:01.476664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637534208 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.476690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:14.619 #54 NEW cov: 11915 ft: 15690 corp: 41/2326b lim: 120 exec/s: 54 rss: 69Mb L: 36/112 MS: 1 ChangeBinInt- 00:08:14.619 [2024-11-02 12:08:01.517141] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.517167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.619 [2024-11-02 12:08:01.517206] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.517221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.619 [2024-11-02 12:08:01.517276] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.517291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.619 #55 NEW cov: 11915 ft: 15709 corp: 42/2412b lim: 120 exec/s: 55 rss: 69Mb L: 86/112 MS: 1 InsertByte- 00:08:14.619 [2024-11-02 12:08:01.557083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:637566976 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.557110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.619 [2024-11-02 12:08:01.557172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.619 [2024-11-02 12:08:01.557188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.619 #56 NEW cov: 11915 ft: 15738 corp: 43/2461b lim: 120 exec/s: 28 rss: 69Mb L: 49/112 MS: 1 ChangeBit- 00:08:14.619 #56 DONE cov: 11915 ft: 15738 corp: 43/2461b lim: 120 exec/s: 28 rss: 69Mb 00:08:14.619 Done 56 runs in 2 second(s) 00:08:14.878 12:08:01 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:14.878 12:08:01 -- ../common.sh@72 -- # (( i++ )) 00:08:14.878 12:08:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.878 12:08:01 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:14.878 12:08:01 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:14.878 12:08:01 -- nvmf/run.sh@24 -- # local timen=1 00:08:14.878 12:08:01 -- nvmf/run.sh@25 -- # local core=0x1 00:08:14.878 12:08:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:14.878 12:08:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:14.879 12:08:01 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:14.879 12:08:01 -- nvmf/run.sh@29 -- # port=4418 00:08:14.879 12:08:01 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:14.879 12:08:01 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:14.879 12:08:01 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:14.879 12:08:01 -- nvmf/run.sh@36 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:14.879 [2024-11-02 12:08:01.733327] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:14.879 [2024-11-02 12:08:01.733397] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1151943 ] 00:08:14.879 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.138 [2024-11-02 12:08:01.982409] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.138 [2024-11-02 12:08:02.010270] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:15.138 [2024-11-02 12:08:02.010404] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.138 [2024-11-02 12:08:02.061874] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.138 [2024-11-02 12:08:02.078259] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:15.138 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.138 INFO: Seed: 149014285 00:08:15.138 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:15.138 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:15.138 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:15.138 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.138 #2 INITED exec/s: 0 rss: 59Mb 00:08:15.138 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:15.138 This may also happen if the target rejected all inputs we tried so far 00:08:15.396 [2024-11-02 12:08:02.133461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.396 [2024-11-02 12:08:02.133490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.396 [2024-11-02 12:08:02.133539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.396 [2024-11-02 12:08:02.133554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.655 NEW_FUNC[1/669]: 0x46ea18 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:15.655 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:15.655 #3 NEW cov: 11624 ft: 11633 corp: 2/49b lim: 100 exec/s: 0 rss: 67Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:15.655 [2024-11-02 12:08:02.434075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.655 [2024-11-02 12:08:02.434108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.655 [2024-11-02 12:08:02.434159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.655 [2024-11-02 12:08:02.434175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.656 NEW_FUNC[1/1]: 0x1290a68 in nvmf_transport_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:723 00:08:15.656 #14 NEW cov: 11745 ft: 12177 corp: 3/98b lim: 100 exec/s: 0 rss: 67Mb L: 49/49 MS: 1 CrossOver- 00:08:15.656 [2024-11-02 12:08:02.474248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.656 [2024-11-02 12:08:02.474274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.656 [2024-11-02 12:08:02.474309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.656 [2024-11-02 12:08:02.474323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.656 [2024-11-02 12:08:02.474371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:15.656 [2024-11-02 12:08:02.474388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.656 #17 NEW cov: 11751 ft: 12671 corp: 4/164b lim: 100 exec/s: 0 rss: 67Mb L: 66/66 MS: 3 CMP-ChangeByte-InsertRepeatedBytes- DE: "\001\1779\371\230z\251\324"- 00:08:15.656 [2024-11-02 12:08:02.514196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.656 [2024-11-02 12:08:02.514223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.656 [2024-11-02 12:08:02.514260] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.656 [2024-11-02 12:08:02.514274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.656 #18 NEW cov: 11836 ft: 12989 corp: 5/204b lim: 100 exec/s: 0 rss: 67Mb L: 40/66 MS: 1 EraseBytes- 00:08:15.656 [2024-11-02 12:08:02.554361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.656 [2024-11-02 12:08:02.554387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.656 [2024-11-02 12:08:02.554432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.656 [2024-11-02 12:08:02.554446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.656 #19 NEW cov: 11836 ft: 13066 corp: 6/252b lim: 100 exec/s: 0 rss: 67Mb L: 48/66 MS: 1 ShuffleBytes- 00:08:15.656 [2024-11-02 12:08:02.594394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.656 [2024-11-02 12:08:02.594421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.656 [2024-11-02 12:08:02.594457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.656 [2024-11-02 12:08:02.594471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.656 #20 NEW cov: 11836 ft: 13117 corp: 7/292b lim: 100 exec/s: 0 rss: 67Mb L: 40/66 MS: 1 ChangeByte- 00:08:15.915 [2024-11-02 12:08:02.634640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.915 [2024-11-02 12:08:02.634666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.915 [2024-11-02 12:08:02.634710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.915 [2024-11-02 12:08:02.634724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.915 [2024-11-02 12:08:02.634773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:15.915 [2024-11-02 12:08:02.634787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.915 #21 NEW cov: 11836 ft: 13162 corp: 8/358b lim: 100 exec/s: 0 rss: 67Mb L: 66/66 MS: 1 ChangeBinInt- 00:08:15.915 [2024-11-02 12:08:02.674861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.915 [2024-11-02 12:08:02.674888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.915 [2024-11-02 12:08:02.674930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.915 [2024-11-02 12:08:02.674943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.915 
[2024-11-02 12:08:02.674991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:15.915 [2024-11-02 12:08:02.675010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.915 [2024-11-02 12:08:02.675064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:15.915 [2024-11-02 12:08:02.675077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.915 #22 NEW cov: 11836 ft: 13547 corp: 9/452b lim: 100 exec/s: 0 rss: 67Mb L: 94/94 MS: 1 CrossOver- 00:08:15.915 [2024-11-02 12:08:02.714761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.915 [2024-11-02 12:08:02.714788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.915 [2024-11-02 12:08:02.714823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.915 [2024-11-02 12:08:02.714837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.915 #23 NEW cov: 11836 ft: 13577 corp: 10/501b lim: 100 exec/s: 0 rss: 68Mb L: 49/94 MS: 1 ChangeBinInt- 00:08:15.915 [2024-11-02 12:08:02.755121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.915 [2024-11-02 12:08:02.755147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.915 [2024-11-02 12:08:02.755211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.915 [2024-11-02 12:08:02.755225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.915 [2024-11-02 12:08:02.755274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:15.915 [2024-11-02 12:08:02.755289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.915 [2024-11-02 12:08:02.755338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:15.915 [2024-11-02 12:08:02.755352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.915 #24 NEW cov: 11836 ft: 13593 corp: 11/599b lim: 100 exec/s: 0 rss: 68Mb L: 98/98 MS: 1 CopyPart- 00:08:15.915 [2024-11-02 12:08:02.795168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.915 [2024-11-02 12:08:02.795194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.915 [2024-11-02 12:08:02.795228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.915 [2024-11-02 12:08:02.795242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.915 [2024-11-02 12:08:02.795291] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:15.915 [2024-11-02 12:08:02.795304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.915 #25 NEW cov: 11836 ft: 13646 corp: 12/665b lim: 100 exec/s: 0 rss: 68Mb L: 66/98 MS: 1 ShuffleBytes- 00:08:15.915 [2024-11-02 12:08:02.835157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.916 [2024-11-02 12:08:02.835182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.916 [2024-11-02 12:08:02.835242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.916 [2024-11-02 12:08:02.835256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.916 #26 NEW cov: 11836 ft: 13675 corp: 13/714b lim: 100 exec/s: 0 rss: 68Mb L: 49/98 MS: 1 ChangeBit- 00:08:15.916 [2024-11-02 12:08:02.875273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:15.916 [2024-11-02 12:08:02.875302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.916 [2024-11-02 12:08:02.875354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:15.916 [2024-11-02 12:08:02.875368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.175 #27 NEW cov: 11836 ft: 13688 corp: 14/763b lim: 100 exec/s: 0 rss: 68Mb L: 49/98 MS: 1 ChangeBinInt- 00:08:16.175 [2024-11-02 12:08:02.915589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.175 [2024-11-02 12:08:02.915614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:02.915659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.175 [2024-11-02 12:08:02.915673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:02.915724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.175 [2024-11-02 12:08:02.915738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:02.915790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:16.175 [2024-11-02 12:08:02.915803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.175 #28 NEW cov: 11836 ft: 13709 corp: 15/848b lim: 100 exec/s: 0 rss: 68Mb L: 85/98 MS: 1 InsertRepeatedBytes- 00:08:16.175 [2024-11-02 12:08:02.955581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.175 [2024-11-02 12:08:02.955607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:02.955648] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.175 [2024-11-02 12:08:02.955662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:02.955711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.175 [2024-11-02 12:08:02.955724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.175 #29 NEW cov: 11836 ft: 13791 corp: 16/922b lim: 100 exec/s: 0 rss: 68Mb L: 74/98 MS: 1 EraseBytes- 00:08:16.175 [2024-11-02 12:08:02.995502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.175 [2024-11-02 12:08:02.995527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.175 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.175 #30 NEW cov: 11859 ft: 14183 corp: 17/959b lim: 100 exec/s: 0 rss: 68Mb L: 37/98 MS: 1 EraseBytes- 00:08:16.175 [2024-11-02 12:08:03.035791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.175 [2024-11-02 12:08:03.035817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:03.035853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.175 [2024-11-02 12:08:03.035866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:03.035914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.175 [2024-11-02 12:08:03.035928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.175 #31 NEW cov: 11859 ft: 14211 corp: 18/1025b lim: 100 exec/s: 0 rss: 68Mb L: 66/98 MS: 1 ChangeBinInt- 00:08:16.175 [2024-11-02 12:08:03.076080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.175 [2024-11-02 12:08:03.076107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:03.076153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.175 [2024-11-02 12:08:03.076167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:03.076216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.175 [2024-11-02 12:08:03.076230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:03.076282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:16.175 [2024-11-02 12:08:03.076296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 
m:0 dnr:1 00:08:16.175 #32 NEW cov: 11859 ft: 14219 corp: 19/1109b lim: 100 exec/s: 0 rss: 68Mb L: 84/98 MS: 1 InsertRepeatedBytes- 00:08:16.175 [2024-11-02 12:08:03.116162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.175 [2024-11-02 12:08:03.116187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:03.116235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.175 [2024-11-02 12:08:03.116248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:03.116298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.175 [2024-11-02 12:08:03.116312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.175 [2024-11-02 12:08:03.116360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:16.176 [2024-11-02 12:08:03.116374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.176 #33 NEW cov: 11859 ft: 14224 corp: 20/1194b lim: 100 exec/s: 33 rss: 68Mb L: 85/98 MS: 1 ShuffleBytes- 00:08:16.435 [2024-11-02 12:08:03.156083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.435 [2024-11-02 12:08:03.156109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.156157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.435 [2024-11-02 12:08:03.156171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.435 #34 NEW cov: 11859 ft: 14237 corp: 21/1251b lim: 100 exec/s: 34 rss: 68Mb L: 57/98 MS: 1 EraseBytes- 00:08:16.435 [2024-11-02 12:08:03.196268] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.435 [2024-11-02 12:08:03.196294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.196339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.435 [2024-11-02 12:08:03.196352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.196405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.435 [2024-11-02 12:08:03.196422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.435 #35 NEW cov: 11859 ft: 14269 corp: 22/1317b lim: 100 exec/s: 35 rss: 68Mb L: 66/98 MS: 1 CopyPart- 00:08:16.435 [2024-11-02 12:08:03.236276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.435 [2024-11-02 12:08:03.236302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.236334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.435 [2024-11-02 12:08:03.236348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.435 #41 NEW cov: 11859 ft: 14302 corp: 23/1366b lim: 100 exec/s: 41 rss: 68Mb L: 49/98 MS: 1 InsertByte- 00:08:16.435 [2024-11-02 12:08:03.266408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.435 [2024-11-02 12:08:03.266434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.266485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.435 [2024-11-02 12:08:03.266500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.435 #42 NEW cov: 11859 ft: 14341 corp: 24/1423b lim: 100 exec/s: 42 rss: 68Mb L: 57/98 MS: 1 ChangeByte- 00:08:16.435 [2024-11-02 12:08:03.306590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.435 [2024-11-02 12:08:03.306615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.306656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.435 [2024-11-02 12:08:03.306670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.306719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.435 [2024-11-02 12:08:03.306732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.435 #43 NEW cov: 11859 ft: 14342 corp: 25/1489b lim: 100 exec/s: 43 rss: 68Mb L: 66/98 MS: 1 ChangeBinInt- 00:08:16.435 [2024-11-02 12:08:03.346818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.435 [2024-11-02 12:08:03.346843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.346883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.435 [2024-11-02 12:08:03.346897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.346946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.435 [2024-11-02 12:08:03.346959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.347011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:16.435 [2024-11-02 12:08:03.347041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:16.435 #44 NEW cov: 11859 ft: 14370 corp: 26/1571b lim: 100 exec/s: 44 rss: 69Mb L: 82/98 MS: 1 CrossOver- 00:08:16.435 [2024-11-02 12:08:03.386948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.435 [2024-11-02 12:08:03.386974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.387012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.435 [2024-11-02 12:08:03.387042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.387091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.435 [2024-11-02 12:08:03.387104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.435 [2024-11-02 12:08:03.387153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:16.435 [2024-11-02 12:08:03.387167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.695 #45 NEW cov: 11859 ft: 14390 corp: 27/1653b lim: 100 exec/s: 45 rss: 69Mb L: 82/98 MS: 1 ChangeBinInt- 00:08:16.695 [2024-11-02 12:08:03.426847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.695 [2024-11-02 12:08:03.426873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.695 [2024-11-02 12:08:03.426937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.695 [2024-11-02 12:08:03.426951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.695 #46 NEW cov: 11859 ft: 14410 corp: 28/1710b lim: 100 exec/s: 46 rss: 69Mb L: 57/98 MS: 1 CopyPart- 00:08:16.695 [2024-11-02 12:08:03.456845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.695 [2024-11-02 12:08:03.456871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.695 #47 NEW cov: 11859 ft: 14435 corp: 29/1735b lim: 100 exec/s: 47 rss: 69Mb L: 25/98 MS: 1 EraseBytes- 00:08:16.695 [2024-11-02 12:08:03.497040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.695 [2024-11-02 12:08:03.497067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.695 [2024-11-02 12:08:03.497127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.695 [2024-11-02 12:08:03.497141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.695 #48 NEW cov: 11859 ft: 14446 corp: 30/1784b lim: 100 exec/s: 48 rss: 69Mb L: 49/98 MS: 1 CrossOver- 00:08:16.695 [2024-11-02 12:08:03.537274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.695 
[2024-11-02 12:08:03.537302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.695 [2024-11-02 12:08:03.537335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.695 [2024-11-02 12:08:03.537349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.695 [2024-11-02 12:08:03.537397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.695 [2024-11-02 12:08:03.537410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.695 #49 NEW cov: 11859 ft: 14454 corp: 31/1850b lim: 100 exec/s: 49 rss: 69Mb L: 66/98 MS: 1 PersAutoDict- DE: "\001\1779\371\230z\251\324"- 00:08:16.695 [2024-11-02 12:08:03.577351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.695 [2024-11-02 12:08:03.577378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.695 [2024-11-02 12:08:03.577432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.695 [2024-11-02 12:08:03.577447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.695 #50 NEW cov: 11859 ft: 14468 corp: 32/1899b lim: 100 exec/s: 50 rss: 69Mb L: 49/98 MS: 1 ChangeBit- 00:08:16.695 [2024-11-02 12:08:03.617503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.695 [2024-11-02 12:08:03.617529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.695 [2024-11-02 12:08:03.617586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.695 [2024-11-02 12:08:03.617601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.695 [2024-11-02 12:08:03.617652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.695 [2024-11-02 12:08:03.617666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.695 #51 NEW cov: 11859 ft: 14471 corp: 33/1965b lim: 100 exec/s: 51 rss: 69Mb L: 66/98 MS: 1 CopyPart- 00:08:16.695 [2024-11-02 12:08:03.657739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.695 [2024-11-02 12:08:03.657766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.695 [2024-11-02 12:08:03.657829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.695 [2024-11-02 12:08:03.657843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.695 [2024-11-02 12:08:03.657891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.695 [2024-11-02 12:08:03.657906] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.695 [2024-11-02 12:08:03.657952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:16.695 [2024-11-02 12:08:03.657965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.954 #52 NEW cov: 11859 ft: 14493 corp: 34/2054b lim: 100 exec/s: 52 rss: 69Mb L: 89/98 MS: 1 InsertRepeatedBytes- 00:08:16.954 [2024-11-02 12:08:03.697880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.954 [2024-11-02 12:08:03.697908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.954 [2024-11-02 12:08:03.697944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.954 [2024-11-02 12:08:03.697959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.954 [2024-11-02 12:08:03.698012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.954 [2024-11-02 12:08:03.698025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.954 [2024-11-02 12:08:03.698073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:16.954 [2024-11-02 12:08:03.698085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.954 #53 NEW cov: 11859 ft: 14512 corp: 35/2139b lim: 100 exec/s: 53 rss: 69Mb L: 85/98 MS: 1 CrossOver- 00:08:16.954 [2024-11-02 12:08:03.737752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.954 [2024-11-02 12:08:03.737777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.954 [2024-11-02 12:08:03.737830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.954 [2024-11-02 12:08:03.737844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.954 #54 NEW cov: 11859 ft: 14544 corp: 36/2188b lim: 100 exec/s: 54 rss: 69Mb L: 49/98 MS: 1 ChangeBinInt- 00:08:16.954 [2024-11-02 12:08:03.778104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.955 [2024-11-02 12:08:03.778131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.955 [2024-11-02 12:08:03.778169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.955 [2024-11-02 12:08:03.778180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.955 [2024-11-02 12:08:03.778228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.955 [2024-11-02 12:08:03.778241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.955 [2024-11-02 12:08:03.778290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:16.955 [2024-11-02 12:08:03.778303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.955 #55 NEW cov: 11859 ft: 14598 corp: 37/2282b lim: 100 exec/s: 55 rss: 69Mb L: 94/98 MS: 1 CopyPart- 00:08:16.955 [2024-11-02 12:08:03.818214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.955 [2024-11-02 12:08:03.818240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.955 [2024-11-02 12:08:03.818295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.955 [2024-11-02 12:08:03.818309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.955 [2024-11-02 12:08:03.818360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:16.955 [2024-11-02 12:08:03.818374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.955 [2024-11-02 12:08:03.818426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:16.955 [2024-11-02 12:08:03.818439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.955 #56 NEW cov: 11859 ft: 14606 corp: 38/2380b lim: 100 exec/s: 56 rss: 69Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:08:16.955 [2024-11-02 12:08:03.858128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.955 [2024-11-02 12:08:03.858153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.955 [2024-11-02 12:08:03.858213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.955 [2024-11-02 12:08:03.858228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.955 #57 NEW cov: 11859 ft: 14619 corp: 39/2437b lim: 100 exec/s: 57 rss: 69Mb L: 57/98 MS: 1 CopyPart- 00:08:16.955 [2024-11-02 12:08:03.898175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.955 [2024-11-02 12:08:03.898201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.955 [2024-11-02 12:08:03.898248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.955 [2024-11-02 12:08:03.898260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.955 #58 NEW cov: 11859 ft: 14638 corp: 40/2485b lim: 100 exec/s: 58 rss: 69Mb L: 48/98 MS: 1 CrossOver- 00:08:16.955 [2024-11-02 12:08:03.928238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:16.955 [2024-11-02 12:08:03.928266] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.955 [2024-11-02 12:08:03.928304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:16.955 [2024-11-02 12:08:03.928318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.214 #59 NEW cov: 11859 ft: 14640 corp: 41/2540b lim: 100 exec/s: 59 rss: 70Mb L: 55/98 MS: 1 CrossOver- 00:08:17.215 [2024-11-02 12:08:03.968482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:17.215 [2024-11-02 12:08:03.968508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.215 [2024-11-02 12:08:03.968543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:17.215 [2024-11-02 12:08:03.968557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.215 [2024-11-02 12:08:03.968608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:17.215 [2024-11-02 12:08:03.968621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.215 #60 NEW cov: 11859 ft: 14647 corp: 42/2612b lim: 100 exec/s: 60 rss: 70Mb L: 72/98 MS: 1 CrossOver- 00:08:17.215 [2024-11-02 12:08:04.008515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:17.215 [2024-11-02 12:08:04.008540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.215 [2024-11-02 12:08:04.008574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:17.215 [2024-11-02 12:08:04.008588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.215 #61 NEW cov: 11859 ft: 14682 corp: 43/2661b lim: 100 exec/s: 61 rss: 70Mb L: 49/98 MS: 1 ChangeBit- 00:08:17.215 [2024-11-02 12:08:04.048636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:17.215 [2024-11-02 12:08:04.048662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.215 [2024-11-02 12:08:04.048712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:17.215 [2024-11-02 12:08:04.048726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.215 #62 NEW cov: 11859 ft: 14734 corp: 44/2701b lim: 100 exec/s: 62 rss: 70Mb L: 40/98 MS: 1 ShuffleBytes- 00:08:17.215 [2024-11-02 12:08:04.088948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:17.215 [2024-11-02 12:08:04.088975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.215 [2024-11-02 12:08:04.089022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 
00:08:17.215 [2024-11-02 12:08:04.089036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.215 [2024-11-02 12:08:04.089084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:17.215 [2024-11-02 12:08:04.089097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.215 [2024-11-02 12:08:04.089166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:17.215 [2024-11-02 12:08:04.089179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.215 #63 NEW cov: 11859 ft: 14735 corp: 45/2786b lim: 100 exec/s: 63 rss: 70Mb L: 85/98 MS: 1 ChangeBit- 00:08:17.215 [2024-11-02 12:08:04.128848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:17.215 [2024-11-02 12:08:04.128875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.215 [2024-11-02 12:08:04.128930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:17.215 [2024-11-02 12:08:04.128945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.215 #64 pulse cov: 11859 ft: 14752 corp: 45/2786b lim: 100 exec/s: 32 rss: 70Mb 00:08:17.215 #64 NEW cov: 11859 ft: 14752 corp: 46/2826b lim: 100 exec/s: 32 rss: 70Mb L: 40/98 MS: 1 EraseBytes- 00:08:17.215 #64 DONE cov: 11859 ft: 14752 corp: 46/2826b lim: 100 exec/s: 32 rss: 70Mb 00:08:17.215 ###### Recommended dictionary. ###### 00:08:17.215 "\001\1779\371\230z\251\324" # Uses: 1 00:08:17.215 ###### End of recommended dictionary. 
###### 00:08:17.215 Done 64 runs in 2 second(s) 00:08:17.474 12:08:04 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:08:17.474 12:08:04 -- ../common.sh@72 -- # (( i++ )) 00:08:17.474 12:08:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.474 12:08:04 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:17.474 12:08:04 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:17.474 12:08:04 -- nvmf/run.sh@24 -- # local timen=1 00:08:17.474 12:08:04 -- nvmf/run.sh@25 -- # local core=0x1 00:08:17.474 12:08:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:17.474 12:08:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:17.474 12:08:04 -- nvmf/run.sh@29 -- # printf %02d 19 00:08:17.474 12:08:04 -- nvmf/run.sh@29 -- # port=4419 00:08:17.474 12:08:04 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:17.474 12:08:04 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:17.474 12:08:04 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:17.474 12:08:04 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:08:17.474 [2024-11-02 12:08:04.304037] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:17.474 [2024-11-02 12:08:04.304109] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1152486 ] 00:08:17.474 EAL: No free 2048 kB hugepages reported on node 1 00:08:17.733 [2024-11-02 12:08:04.553940] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.733 [2024-11-02 12:08:04.582708] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:17.733 [2024-11-02 12:08:04.582847] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.733 [2024-11-02 12:08:04.634090] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:17.733 [2024-11-02 12:08:04.650434] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:17.733 INFO: Running with entropic power schedule (0xFF, 100). 00:08:17.733 INFO: Seed: 2720004254 00:08:17.733 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:17.733 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:17.733 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:17.733 INFO: A corpus is not provided, starting from an empty corpus 00:08:17.733 #2 INITED exec/s: 0 rss: 59Mb 00:08:17.733 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:17.733 This may also happen if the target rejected all inputs we tried so far 00:08:17.733 [2024-11-02 12:08:04.705775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:17.733 [2024-11-02 12:08:04.705806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.733 [2024-11-02 12:08:04.705844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:17.733 [2024-11-02 12:08:04.705859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.733 [2024-11-02 12:08:04.705908] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:17.733 [2024-11-02 12:08:04.705924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.733 [2024-11-02 12:08:04.705976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:17.733 [2024-11-02 12:08:04.705990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.252 NEW_FUNC[1/669]: 0x4719d8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:18.252 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.252 #5 NEW cov: 11597 ft: 11611 corp: 2/46b lim: 50 exec/s: 0 rss: 67Mb L: 45/45 MS: 3 ChangeByte-ChangeBinInt-InsertRepeatedBytes- 00:08:18.252 [2024-11-02 12:08:05.026657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:18.252 [2024-11-02 12:08:05.026700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.252 [2024-11-02 12:08:05.026765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:18.252 [2024-11-02 12:08:05.026787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.252 [2024-11-02 12:08:05.026849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:18.252 [2024-11-02 12:08:05.026869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.252 [2024-11-02 12:08:05.026931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:18.252 [2024-11-02 12:08:05.026952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.252 NEW_FUNC[1/1]: 0x1c78a18 in spdk_thread_get_last_tsc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1310 00:08:18.252 #6 NEW cov: 11723 ft: 12180 corp: 3/94b lim: 50 exec/s: 0 rss: 67Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:18.252 [2024-11-02 12:08:05.076522] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:18.252 [2024-11-02 12:08:05.076551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.252 [2024-11-02 12:08:05.076601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:18.252 [2024-11-02 12:08:05.076618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.252 [2024-11-02 12:08:05.076674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:18.252 [2024-11-02 12:08:05.076691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.252 #7 NEW cov: 11729 ft: 12616 corp: 4/131b lim: 50 exec/s: 0 rss: 67Mb L: 37/48 MS: 1 InsertRepeatedBytes- 00:08:18.252 [2024-11-02 12:08:05.116512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17289301305884405743 len:61424 00:08:18.252 [2024-11-02 12:08:05.116540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.252 [2024-11-02 12:08:05.116591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17289301308300324847 len:61424 00:08:18.252 [2024-11-02 12:08:05.116607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.252 #12 NEW cov: 11814 ft: 13212 corp: 5/154b lim: 50 exec/s: 0 rss: 67Mb L: 23/48 MS: 5 ChangeBit-ChangeByte-ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:18.252 [2024-11-02 12:08:05.156818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:18.252 [2024-11-02 12:08:05.156846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.252 [2024-11-02 12:08:05.156899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:5 00:08:18.252 [2024-11-02 12:08:05.156915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.252 [2024-11-02 12:08:05.156967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:18.252 [2024-11-02 12:08:05.156982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.252 [2024-11-02 12:08:05.157038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:18.252 [2024-11-02 12:08:05.157054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.252 #13 NEW cov: 11814 ft: 13375 corp: 6/199b lim: 50 exec/s: 0 rss: 67Mb L: 45/48 MS: 1 ChangeBit- 00:08:18.252 [2024-11-02 12:08:05.196964] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:18.252 [2024-11-02 12:08:05.196997] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.252 [2024-11-02 12:08:05.197060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:18.252 [2024-11-02 12:08:05.197076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.252 [2024-11-02 12:08:05.197123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:18.252 [2024-11-02 12:08:05.197138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.253 [2024-11-02 12:08:05.197189] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:18.253 [2024-11-02 12:08:05.197203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.253 #14 NEW cov: 11814 ft: 13431 corp: 7/244b lim: 50 exec/s: 0 rss: 67Mb L: 45/48 MS: 1 CopyPart- 00:08:18.512 [2024-11-02 12:08:05.236975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:18.512 [2024-11-02 12:08:05.237008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.512 [2024-11-02 12:08:05.237065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1024 len:1 00:08:18.512 [2024-11-02 12:08:05.237081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.512 [2024-11-02 12:08:05.237130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:18.512 [2024-11-02 12:08:05.237145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.512 #15 NEW cov: 11814 ft: 13516 corp: 8/282b lim: 50 exec/s: 0 rss: 67Mb L: 38/48 MS: 1 EraseBytes- 00:08:18.512 [2024-11-02 12:08:05.277192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:18.512 [2024-11-02 12:08:05.277220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.512 [2024-11-02 12:08:05.277260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:18.512 [2024-11-02 12:08:05.277275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.512 [2024-11-02 12:08:05.277324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2522015791327477760 len:1 00:08:18.512 [2024-11-02 12:08:05.277339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.512 [2024-11-02 12:08:05.277404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:18.512 [2024-11-02 12:08:05.277420] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.512 #16 NEW cov: 11814 ft: 13568 corp: 9/330b lim: 50 exec/s: 0 rss: 67Mb L: 48/48 MS: 1 ChangeByte- 00:08:18.512 [2024-11-02 12:08:05.317322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724550144 len:1 00:08:18.512 [2024-11-02 12:08:05.317350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.512 [2024-11-02 12:08:05.317404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:18.512 [2024-11-02 12:08:05.317420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.512 [2024-11-02 12:08:05.317471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:18.512 [2024-11-02 12:08:05.317485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.512 [2024-11-02 12:08:05.317535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:18.512 [2024-11-02 12:08:05.317550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.512 #17 NEW cov: 11814 ft: 13630 corp: 10/375b lim: 50 exec/s: 0 rss: 67Mb L: 45/48 MS: 1 ChangeBit- 00:08:18.513 [2024-11-02 12:08:05.357438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:18.513 [2024-11-02 12:08:05.357466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.513 [2024-11-02 12:08:05.357504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2304 len:1 00:08:18.513 [2024-11-02 12:08:05.357520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.513 [2024-11-02 12:08:05.357571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2522015791327477760 len:1 00:08:18.513 [2024-11-02 12:08:05.357588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.513 [2024-11-02 12:08:05.357637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:18.513 [2024-11-02 12:08:05.357652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.513 #23 NEW cov: 11814 ft: 13685 corp: 11/423b lim: 50 exec/s: 0 rss: 67Mb L: 48/48 MS: 1 ChangeBinInt- 00:08:18.513 [2024-11-02 12:08:05.397454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 00:08:18.513 [2024-11-02 12:08:05.397482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.513 [2024-11-02 12:08:05.397532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:18.513 [2024-11-02 12:08:05.397548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.513 [2024-11-02 12:08:05.397598] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:18.513 [2024-11-02 12:08:05.397614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.513 #24 NEW cov: 11814 ft: 13707 corp: 12/460b lim: 50 exec/s: 0 rss: 68Mb L: 37/48 MS: 1 ChangeBit- 00:08:18.513 [2024-11-02 12:08:05.437681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:18.513 [2024-11-02 12:08:05.437709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.513 [2024-11-02 12:08:05.437745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:18.513 [2024-11-02 12:08:05.437760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.513 [2024-11-02 12:08:05.437807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:18.513 [2024-11-02 12:08:05.437822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.513 [2024-11-02 12:08:05.437870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:51791395714760704 len:1 00:08:18.513 [2024-11-02 12:08:05.437884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.513 #25 NEW cov: 11814 ft: 13786 corp: 13/508b lim: 50 exec/s: 0 rss: 68Mb L: 48/48 MS: 1 ChangeByte- 00:08:18.513 [2024-11-02 12:08:05.477821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:18.513 [2024-11-02 12:08:05.477849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.513 [2024-11-02 12:08:05.477896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:18.513 [2024-11-02 12:08:05.477912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.513 [2024-11-02 12:08:05.477963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:102254581383168 len:1 00:08:18.513 [2024-11-02 12:08:05.477978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.513 [2024-11-02 12:08:05.478036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:18.513 [2024-11-02 12:08:05.478054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.772 #26 NEW cov: 11814 ft: 13805 corp: 14/553b lim: 50 exec/s: 0 rss: 68Mb L: 45/48 MS: 1 
ChangeByte- 00:08:18.772 [2024-11-02 12:08:05.517787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:18.772 [2024-11-02 12:08:05.517816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.772 [2024-11-02 12:08:05.517864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:5 00:08:18.772 [2024-11-02 12:08:05.517880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.772 [2024-11-02 12:08:05.517931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:18.772 [2024-11-02 12:08:05.517946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.772 #27 NEW cov: 11814 ft: 13849 corp: 15/588b lim: 50 exec/s: 0 rss: 68Mb L: 35/48 MS: 1 EraseBytes- 00:08:18.772 [2024-11-02 12:08:05.558042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 00:08:18.772 [2024-11-02 12:08:05.558070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.772 [2024-11-02 12:08:05.558129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:94927367176192 len:22103 00:08:18.772 [2024-11-02 12:08:05.558144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.772 [2024-11-02 12:08:05.558194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6221254864074593878 len:1 00:08:18.772 [2024-11-02 12:08:05.558209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.772 [2024-11-02 12:08:05.558260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:18.772 [2024-11-02 12:08:05.558275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.772 #28 NEW cov: 11814 ft: 13915 corp: 16/637b lim: 50 exec/s: 0 rss: 69Mb L: 49/49 MS: 1 InsertRepeatedBytes- 00:08:18.772 [2024-11-02 12:08:05.598142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:18.772 [2024-11-02 12:08:05.598170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.772 [2024-11-02 12:08:05.598203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1025 00:08:18.772 [2024-11-02 12:08:05.598218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.772 [2024-11-02 12:08:05.598267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2522015791327477760 len:1 00:08:18.772 [2024-11-02 12:08:05.598282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.772 [2024-11-02 12:08:05.598330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:18.772 [2024-11-02 12:08:05.598345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.772 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:18.772 #34 NEW cov: 11837 ft: 13955 corp: 17/685b lim: 50 exec/s: 0 rss: 69Mb L: 48/49 MS: 1 ChangeBit- 00:08:18.772 [2024-11-02 12:08:05.638235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:18.772 [2024-11-02 12:08:05.638263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.772 [2024-11-02 12:08:05.638303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:18.772 [2024-11-02 12:08:05.638318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.772 [2024-11-02 12:08:05.638368] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2424832 len:1 00:08:18.772 [2024-11-02 12:08:05.638383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.772 [2024-11-02 12:08:05.638435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:18.773 [2024-11-02 12:08:05.638448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.773 #35 NEW cov: 11837 ft: 13964 corp: 18/730b lim: 50 exec/s: 0 rss: 69Mb L: 45/49 MS: 1 ChangeByte- 00:08:18.773 [2024-11-02 12:08:05.668020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:111851470848 len:1 00:08:18.773 [2024-11-02 12:08:05.668057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.773 #39 NEW cov: 11837 ft: 14325 corp: 19/744b lim: 50 exec/s: 0 rss: 69Mb L: 14/49 MS: 4 CopyPart-CopyPart-ChangeByte-CrossOver- 00:08:18.773 [2024-11-02 12:08:05.708442] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:62487444829765632 len:1 00:08:18.773 [2024-11-02 12:08:05.708469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.773 [2024-11-02 12:08:05.708509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:18.773 [2024-11-02 12:08:05.708524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.773 [2024-11-02 12:08:05.708574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:288230376151711744 len:1 00:08:18.773 [2024-11-02 12:08:05.708589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:18.773 [2024-11-02 12:08:05.708638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:18.773 [2024-11-02 12:08:05.708652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.773 #40 NEW cov: 11837 ft: 14352 corp: 20/784b lim: 50 exec/s: 40 rss: 69Mb L: 40/49 MS: 1 InsertRepeatedBytes- 00:08:19.032 [2024-11-02 12:08:05.748239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:988024799232 len:65281 00:08:19.032 [2024-11-02 12:08:05.748267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.032 #41 NEW cov: 11837 ft: 14378 corp: 21/798b lim: 50 exec/s: 41 rss: 69Mb L: 14/49 MS: 1 ChangeBinInt- 00:08:19.032 [2024-11-02 12:08:05.788344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:172023296 len:59136 00:08:19.032 [2024-11-02 12:08:05.788371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.032 #42 NEW cov: 11837 ft: 14391 corp: 22/813b lim: 50 exec/s: 42 rss: 69Mb L: 15/49 MS: 1 InsertByte- 00:08:19.032 [2024-11-02 12:08:05.828772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724565760 len:1 00:08:19.032 [2024-11-02 12:08:05.828802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.032 [2024-11-02 12:08:05.828836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:19.032 [2024-11-02 12:08:05.828852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.032 [2024-11-02 12:08:05.828902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:19.032 [2024-11-02 12:08:05.828917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.032 [2024-11-02 12:08:05.828982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:19.032 [2024-11-02 12:08:05.829000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.032 #43 NEW cov: 11837 ft: 14393 corp: 23/858b lim: 50 exec/s: 43 rss: 69Mb L: 45/49 MS: 1 ChangeByte- 00:08:19.032 [2024-11-02 12:08:05.868875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3377703445070080 len:1 00:08:19.032 [2024-11-02 12:08:05.868904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.032 [2024-11-02 12:08:05.868955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:19.032 [2024-11-02 12:08:05.868971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.032 [2024-11-02 12:08:05.869022] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2424832 len:1 00:08:19.032 [2024-11-02 12:08:05.869039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.032 [2024-11-02 12:08:05.869092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:19.033 [2024-11-02 12:08:05.869107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.033 #44 NEW cov: 11837 ft: 14408 corp: 24/903b lim: 50 exec/s: 44 rss: 69Mb L: 45/49 MS: 1 CMP- DE: "\001\000\000\014"- 00:08:19.033 [2024-11-02 12:08:05.908707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069582422016 len:265 00:08:19.033 [2024-11-02 12:08:05.908734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.033 #49 NEW cov: 11837 ft: 14479 corp: 25/917b lim: 50 exec/s: 49 rss: 69Mb L: 14/49 MS: 5 ShuffleBytes-ShuffleBytes-InsertByte-PersAutoDict-CMP- DE: "\001\000\000\014"-"\377\377\377\377\001\010'\323"- 00:08:19.033 [2024-11-02 12:08:05.948821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:19.033 [2024-11-02 12:08:05.948848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.033 #51 NEW cov: 11837 ft: 14493 corp: 26/934b lim: 50 exec/s: 51 rss: 69Mb L: 17/49 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:19.033 [2024-11-02 12:08:05.979222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:19.033 [2024-11-02 12:08:05.979250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.033 [2024-11-02 12:08:05.979293] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:5 00:08:19.033 [2024-11-02 12:08:05.979308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.033 [2024-11-02 12:08:05.979362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:19.033 [2024-11-02 12:08:05.979376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.033 [2024-11-02 12:08:05.979427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:127 00:08:19.033 [2024-11-02 12:08:05.979442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.033 #52 NEW cov: 11837 ft: 14529 corp: 27/979b lim: 50 exec/s: 52 rss: 69Mb L: 45/49 MS: 1 ChangeByte- 00:08:19.292 [2024-11-02 12:08:06.019018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7318349576798208 len:1 00:08:19.292 [2024-11-02 12:08:06.019045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.292 #53 NEW cov: 11837 ft: 14614 
corp: 28/993b lim: 50 exec/s: 53 rss: 69Mb L: 14/49 MS: 1 ShuffleBytes- 00:08:19.292 [2024-11-02 12:08:06.059473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 00:08:19.292 [2024-11-02 12:08:06.059500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.292 [2024-11-02 12:08:06.059548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:94927367176192 len:11863 00:08:19.292 [2024-11-02 12:08:06.059564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.292 [2024-11-02 12:08:06.059611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6221254864074593878 len:1 00:08:19.292 [2024-11-02 12:08:06.059626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.292 [2024-11-02 12:08:06.059677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:19.292 [2024-11-02 12:08:06.059692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.292 #54 NEW cov: 11837 ft: 14655 corp: 29/1042b lim: 50 exec/s: 54 rss: 69Mb L: 49/49 MS: 1 ChangeByte- 00:08:19.292 [2024-11-02 12:08:06.099552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724550144 len:1 00:08:19.292 [2024-11-02 12:08:06.099580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.292 [2024-11-02 12:08:06.099625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:19.292 [2024-11-02 12:08:06.099639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.292 [2024-11-02 12:08:06.099691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:19.292 [2024-11-02 12:08:06.099722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.292 [2024-11-02 12:08:06.099774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:19.292 [2024-11-02 12:08:06.099789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.293 #55 NEW cov: 11837 ft: 14662 corp: 30/1089b lim: 50 exec/s: 55 rss: 69Mb L: 47/49 MS: 1 CopyPart- 00:08:19.293 [2024-11-02 12:08:06.139601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:19.293 [2024-11-02 12:08:06.139629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.293 [2024-11-02 12:08:06.139673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:5 00:08:19.293 [2024-11-02 12:08:06.139689] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.293 [2024-11-02 12:08:06.139739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:19.293 [2024-11-02 12:08:06.139754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.293 #56 NEW cov: 11837 ft: 14670 corp: 31/1122b lim: 50 exec/s: 56 rss: 69Mb L: 33/49 MS: 1 EraseBytes- 00:08:19.293 [2024-11-02 12:08:06.179786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724565760 len:1 00:08:19.293 [2024-11-02 12:08:06.179814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.293 [2024-11-02 12:08:06.179856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:65536 00:08:19.293 [2024-11-02 12:08:06.179871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.293 [2024-11-02 12:08:06.179923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2869637391860039944 len:1 00:08:19.293 [2024-11-02 12:08:06.179938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.293 [2024-11-02 12:08:06.179988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:19.293 [2024-11-02 12:08:06.180009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.293 #57 NEW cov: 11837 ft: 14685 corp: 32/1167b lim: 50 exec/s: 57 rss: 70Mb L: 45/49 MS: 1 PersAutoDict- DE: "\377\377\377\377\001\010'\323"- 00:08:19.293 [2024-11-02 12:08:06.219580] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15996785876602322954 len:1 00:08:19.293 [2024-11-02 12:08:06.219608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.293 #58 NEW cov: 11837 ft: 14695 corp: 33/1181b lim: 50 exec/s: 58 rss: 70Mb L: 14/49 MS: 1 CopyPart- 00:08:19.293 [2024-11-02 12:08:06.260056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 00:08:19.293 [2024-11-02 12:08:06.260083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.293 [2024-11-02 12:08:06.260139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:94927367176192 len:22103 00:08:19.293 [2024-11-02 12:08:06.260166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.293 [2024-11-02 12:08:06.260213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6221254864074593878 len:1 00:08:19.293 [2024-11-02 12:08:06.260228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.293 [2024-11-02 12:08:06.260277] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12032 len:1 00:08:19.293 [2024-11-02 12:08:06.260292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.552 #59 NEW cov: 11837 ft: 14707 corp: 34/1230b lim: 50 exec/s: 59 rss: 70Mb L: 49/49 MS: 1 ChangeByte- 00:08:19.552 [2024-11-02 12:08:06.300265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:19.552 [2024-11-02 12:08:06.300295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.552 [2024-11-02 12:08:06.300331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:19.552 [2024-11-02 12:08:06.300345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.552 [2024-11-02 12:08:06.300395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:19.552 [2024-11-02 12:08:06.300410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.552 [2024-11-02 12:08:06.300459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:790273982464 len:1 00:08:19.552 [2024-11-02 12:08:06.300473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.552 [2024-11-02 12:08:06.300539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:72057589742960640 len:1 00:08:19.552 [2024-11-02 12:08:06.300554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.552 #60 NEW cov: 11837 ft: 14779 corp: 35/1280b lim: 50 exec/s: 60 rss: 70Mb L: 50/50 MS: 1 CrossOver- 00:08:19.552 [2024-11-02 12:08:06.339941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:111851470858 len:1 00:08:19.552 [2024-11-02 12:08:06.339968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.552 #61 NEW cov: 11837 ft: 14804 corp: 36/1291b lim: 50 exec/s: 61 rss: 70Mb L: 11/50 MS: 1 EraseBytes- 00:08:19.552 [2024-11-02 12:08:06.380395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:19.552 [2024-11-02 12:08:06.380423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.552 [2024-11-02 12:08:06.380478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9 len:1 00:08:19.552 [2024-11-02 12:08:06.380494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.552 [2024-11-02 12:08:06.380547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:9851624184872960 len:1 00:08:19.552 [2024-11-02 12:08:06.380562] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.552 [2024-11-02 12:08:06.380611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:19.552 [2024-11-02 12:08:06.380626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.552 #62 NEW cov: 11837 ft: 14806 corp: 37/1340b lim: 50 exec/s: 62 rss: 70Mb L: 49/50 MS: 1 CrossOver- 00:08:19.552 [2024-11-02 12:08:06.420625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724565760 len:1 00:08:19.552 [2024-11-02 12:08:06.420652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.552 [2024-11-02 12:08:06.420704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:65281 00:08:19.553 [2024-11-02 12:08:06.420719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.553 [2024-11-02 12:08:06.420782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446742978492891136 len:2088 00:08:19.553 [2024-11-02 12:08:06.420797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.553 [2024-11-02 12:08:06.420848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3539992576 len:1 00:08:19.553 [2024-11-02 12:08:06.420864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.553 [2024-11-02 12:08:06.420914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:0 len:1 00:08:19.553 [2024-11-02 12:08:06.420929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.553 #63 NEW cov: 11837 ft: 14825 corp: 38/1390b lim: 50 exec/s: 63 rss: 70Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:19.553 [2024-11-02 12:08:06.460585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2315255808 len:1 00:08:19.553 [2024-11-02 12:08:06.460612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.553 [2024-11-02 12:08:06.460665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:94927367176192 len:11863 00:08:19.553 [2024-11-02 12:08:06.460682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.553 [2024-11-02 12:08:06.460730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6221254864074593878 len:1 00:08:19.553 [2024-11-02 12:08:06.460746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.553 [2024-11-02 12:08:06.460795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 
lba:18446744069414649855 len:65343 00:08:19.553 [2024-11-02 12:08:06.460810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.553 #64 NEW cov: 11837 ft: 14837 corp: 39/1439b lim: 50 exec/s: 64 rss: 70Mb L: 49/50 MS: 1 CMP- DE: "\377\377\377\377\377\377\377>"- 00:08:19.553 [2024-11-02 12:08:06.500350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13237271527425 len:56833 00:08:19.553 [2024-11-02 12:08:06.500377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.553 #65 NEW cov: 11837 ft: 14856 corp: 40/1457b lim: 50 exec/s: 65 rss: 70Mb L: 18/50 MS: 1 PersAutoDict- DE: "\001\000\000\014"- 00:08:19.812 [2024-11-02 12:08:06.540701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:19.812 [2024-11-02 12:08:06.540730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.813 [2024-11-02 12:08:06.540784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:19.813 [2024-11-02 12:08:06.540800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.813 [2024-11-02 12:08:06.540852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:67108864 len:1 00:08:19.813 [2024-11-02 12:08:06.540868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.813 #66 NEW cov: 11837 ft: 14873 corp: 41/1491b lim: 50 exec/s: 66 rss: 70Mb L: 34/50 MS: 1 CopyPart- 00:08:19.813 [2024-11-02 12:08:06.580928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2315255808 len:15617 00:08:19.813 [2024-11-02 12:08:06.580958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.813 [2024-11-02 12:08:06.581002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:94927367176192 len:22103 00:08:19.813 [2024-11-02 12:08:06.581021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.813 [2024-11-02 12:08:06.581069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6221254864074593878 len:1 00:08:19.813 [2024-11-02 12:08:06.581085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.813 [2024-11-02 12:08:06.581136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12032 len:1 00:08:19.813 [2024-11-02 12:08:06.581151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.813 #67 NEW cov: 11837 ft: 14877 corp: 42/1540b lim: 50 exec/s: 67 rss: 70Mb L: 49/50 MS: 1 ChangeByte- 00:08:19.813 [2024-11-02 12:08:06.620925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 
cid:0 nsid:0 lba:3724541952 len:1 00:08:19.813 [2024-11-02 12:08:06.620954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.813 [2024-11-02 12:08:06.621007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1125899906842624 len:1 00:08:19.813 [2024-11-02 12:08:06.621023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.813 [2024-11-02 12:08:06.621077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:67108864 len:1 00:08:19.813 [2024-11-02 12:08:06.621092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.813 #68 NEW cov: 11837 ft: 14890 corp: 43/1574b lim: 50 exec/s: 68 rss: 70Mb L: 34/50 MS: 1 ChangeBit- 00:08:19.813 [2024-11-02 12:08:06.660874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374976922540048182 len:54228 00:08:19.813 [2024-11-02 12:08:06.660901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.813 #76 NEW cov: 11837 ft: 14910 corp: 44/1584b lim: 50 exec/s: 76 rss: 70Mb L: 10/50 MS: 3 PersAutoDict-PersAutoDict-InsertByte- DE: "\377\377\377\377\001\010'\323"-"\377\377\377\377\001\010'\323"- 00:08:19.813 [2024-11-02 12:08:06.701304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3724541952 len:1 00:08:19.813 [2024-11-02 12:08:06.701332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.813 [2024-11-02 12:08:06.701366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2304 len:1 00:08:19.813 [2024-11-02 12:08:06.701382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.813 [2024-11-02 12:08:06.701433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2522015791327477760 len:1 00:08:19.813 [2024-11-02 12:08:06.701449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.813 [2024-11-02 12:08:06.701499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:2561 00:08:19.813 [2024-11-02 12:08:06.701513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.813 #77 NEW cov: 11837 ft: 14927 corp: 45/1632b lim: 50 exec/s: 38 rss: 70Mb L: 48/50 MS: 1 ChangeBinInt- 00:08:19.813 #77 DONE cov: 11837 ft: 14927 corp: 45/1632b lim: 50 exec/s: 38 rss: 70Mb 00:08:19.813 ###### Recommended dictionary. ###### 00:08:19.813 "\001\000\000\014" # Uses: 2 00:08:19.813 "\377\377\377\377\001\010'\323" # Uses: 3 00:08:19.813 "\377\377\377\377\377\377\377>" # Uses: 0 00:08:19.813 ###### End of recommended dictionary. 
###### 00:08:19.813 Done 77 runs in 2 second(s) 00:08:20.073 12:08:06 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:20.073 12:08:06 -- ../common.sh@72 -- # (( i++ )) 00:08:20.073 12:08:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.073 12:08:06 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:20.073 12:08:06 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:20.073 12:08:06 -- nvmf/run.sh@24 -- # local timen=1 00:08:20.073 12:08:06 -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.073 12:08:06 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:20.073 12:08:06 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:20.073 12:08:06 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:20.073 12:08:06 -- nvmf/run.sh@29 -- # port=4420 00:08:20.073 12:08:06 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:20.073 12:08:06 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:20.073 12:08:06 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.073 12:08:06 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:20.073 [2024-11-02 12:08:06.875035] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:20.073 [2024-11-02 12:08:06.875105] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1152894 ] 00:08:20.073 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.332 [2024-11-02 12:08:07.133321] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.332 [2024-11-02 12:08:07.159721] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:20.332 [2024-11-02 12:08:07.159846] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.332 [2024-11-02 12:08:07.211115] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.332 [2024-11-02 12:08:07.227478] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:20.332 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.332 INFO: Seed: 1002035133 00:08:20.332 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:20.332 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:20.332 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:20.332 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.332 #2 INITED exec/s: 0 rss: 59Mb 00:08:20.332 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:20.332 This may also happen if the target rejected all inputs we tried so far 00:08:20.332 [2024-11-02 12:08:07.282781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.332 [2024-11-02 12:08:07.282813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.332 [2024-11-02 12:08:07.282852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.332 [2024-11-02 12:08:07.282868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.332 [2024-11-02 12:08:07.282921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:20.332 [2024-11-02 12:08:07.282935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.591 NEW_FUNC[1/671]: 0x473598 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:20.591 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:20.591 #20 NEW cov: 11666 ft: 11669 corp: 2/67b lim: 90 exec/s: 0 rss: 67Mb L: 66/66 MS: 3 InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:20.851 [2024-11-02 12:08:07.573610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.851 [2024-11-02 12:08:07.573648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.573714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.851 [2024-11-02 12:08:07.573734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.573794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:20.851 [2024-11-02 12:08:07.573814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.851 NEW_FUNC[1/1]: 0x1c72478 in spdk_thread_get_from_ctx /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:795 00:08:20.851 #26 NEW cov: 11781 ft: 12136 corp: 3/133b lim: 90 exec/s: 0 rss: 67Mb L: 66/66 MS: 1 ShuffleBytes- 00:08:20.851 [2024-11-02 12:08:07.623673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.851 [2024-11-02 12:08:07.623701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.623752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.851 [2024-11-02 12:08:07.623768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.623823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:20.851 [2024-11-02 
12:08:07.623840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.851 #27 NEW cov: 11787 ft: 12411 corp: 4/199b lim: 90 exec/s: 0 rss: 67Mb L: 66/66 MS: 1 ChangeByte- 00:08:20.851 [2024-11-02 12:08:07.663944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.851 [2024-11-02 12:08:07.663972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.664027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.851 [2024-11-02 12:08:07.664043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.664097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:20.851 [2024-11-02 12:08:07.664112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.664169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:20.851 [2024-11-02 12:08:07.664184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.851 #28 NEW cov: 11872 ft: 13001 corp: 5/287b lim: 90 exec/s: 0 rss: 67Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:08:20.851 [2024-11-02 12:08:07.714079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.851 [2024-11-02 12:08:07.714106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.714148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.851 [2024-11-02 12:08:07.714163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.714219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:20.851 [2024-11-02 12:08:07.714233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.714286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:20.851 [2024-11-02 12:08:07.714301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.851 #29 NEW cov: 11872 ft: 13125 corp: 6/369b lim: 90 exec/s: 0 rss: 67Mb L: 82/88 MS: 1 CopyPart- 00:08:20.851 [2024-11-02 12:08:07.754188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.851 [2024-11-02 12:08:07.754215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.754264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.851 [2024-11-02 
12:08:07.754280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.754334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:20.851 [2024-11-02 12:08:07.754348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.754400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:20.851 [2024-11-02 12:08:07.754415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.851 #30 NEW cov: 11872 ft: 13194 corp: 7/454b lim: 90 exec/s: 0 rss: 67Mb L: 85/88 MS: 1 InsertRepeatedBytes- 00:08:20.851 [2024-11-02 12:08:07.794166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:20.851 [2024-11-02 12:08:07.794194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.794231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:20.851 [2024-11-02 12:08:07.794247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.851 [2024-11-02 12:08:07.794302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:20.851 [2024-11-02 12:08:07.794316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.851 #31 NEW cov: 11872 ft: 13283 corp: 8/520b lim: 90 exec/s: 0 rss: 67Mb L: 66/88 MS: 1 ChangeBit- 00:08:21.111 [2024-11-02 12:08:07.834242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.111 [2024-11-02 12:08:07.834269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.111 [2024-11-02 12:08:07.834308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.111 [2024-11-02 12:08:07.834323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.111 [2024-11-02 12:08:07.834379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.111 [2024-11-02 12:08:07.834394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.111 #32 NEW cov: 11872 ft: 13326 corp: 9/586b lim: 90 exec/s: 0 rss: 67Mb L: 66/88 MS: 1 ChangeBinInt- 00:08:21.111 [2024-11-02 12:08:07.874588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.111 [2024-11-02 12:08:07.874615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.111 [2024-11-02 12:08:07.874670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.111 [2024-11-02 
12:08:07.874686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.111 [2024-11-02 12:08:07.874740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.111 [2024-11-02 12:08:07.874756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.111 [2024-11-02 12:08:07.874810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.111 [2024-11-02 12:08:07.874826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.111 #33 NEW cov: 11872 ft: 13331 corp: 10/670b lim: 90 exec/s: 0 rss: 67Mb L: 84/88 MS: 1 InsertRepeatedBytes- 00:08:21.111 [2024-11-02 12:08:07.914755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.111 [2024-11-02 12:08:07.914782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.111 [2024-11-02 12:08:07.914835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.111 [2024-11-02 12:08:07.914850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:07.914903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.112 [2024-11-02 12:08:07.914918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:07.914970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.112 [2024-11-02 12:08:07.914986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:07.915046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:21.112 [2024-11-02 12:08:07.915061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:21.112 #34 NEW cov: 11872 ft: 13446 corp: 11/760b lim: 90 exec/s: 0 rss: 67Mb L: 90/90 MS: 1 CrossOver- 00:08:21.112 [2024-11-02 12:08:07.954736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.112 [2024-11-02 12:08:07.954763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:07.954814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.112 [2024-11-02 12:08:07.954830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:07.954886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.112 [2024-11-02 12:08:07.954899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:07.954953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.112 [2024-11-02 12:08:07.954968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.112 #35 NEW cov: 11872 ft: 13552 corp: 12/848b lim: 90 exec/s: 0 rss: 67Mb L: 88/90 MS: 1 ChangeBinInt- 00:08:21.112 [2024-11-02 12:08:07.994889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.112 [2024-11-02 12:08:07.994916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:07.994967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.112 [2024-11-02 12:08:07.994983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:07.995043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.112 [2024-11-02 12:08:07.995060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:07.995113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.112 [2024-11-02 12:08:07.995127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.112 #36 NEW cov: 11872 ft: 13621 corp: 13/936b lim: 90 exec/s: 0 rss: 68Mb L: 88/90 MS: 1 InsertRepeatedBytes- 00:08:21.112 [2024-11-02 12:08:08.034859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.112 [2024-11-02 12:08:08.034886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:08.034931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.112 [2024-11-02 12:08:08.034946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:08.035007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.112 [2024-11-02 12:08:08.035023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.112 #37 NEW cov: 11872 ft: 13672 corp: 14/1002b lim: 90 exec/s: 0 rss: 68Mb L: 66/90 MS: 1 ChangeBit- 00:08:21.112 [2024-11-02 12:08:08.074943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.112 [2024-11-02 12:08:08.074969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:08.075011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.112 [2024-11-02 12:08:08.075027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.112 [2024-11-02 12:08:08.075082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.112 [2024-11-02 12:08:08.075097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.371 #38 NEW cov: 11872 ft: 13680 corp: 15/1068b lim: 90 exec/s: 0 rss: 68Mb L: 66/90 MS: 1 ShuffleBytes- 00:08:21.371 [2024-11-02 12:08:08.115258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.371 [2024-11-02 12:08:08.115286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.371 [2024-11-02 12:08:08.115330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.371 [2024-11-02 12:08:08.115345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.371 [2024-11-02 12:08:08.115400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.372 [2024-11-02 12:08:08.115419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.115472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.372 [2024-11-02 12:08:08.115487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.372 #39 NEW cov: 11872 ft: 13691 corp: 16/1152b lim: 90 exec/s: 0 rss: 68Mb L: 84/90 MS: 1 CopyPart- 00:08:21.372 [2024-11-02 12:08:08.155215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.372 [2024-11-02 12:08:08.155243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.155281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.372 [2024-11-02 12:08:08.155297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.155352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.372 [2024-11-02 12:08:08.155368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.372 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:21.372 #40 NEW cov: 11895 ft: 13776 corp: 17/1218b lim: 90 exec/s: 0 rss: 68Mb L: 66/90 MS: 1 ChangeBinInt- 00:08:21.372 [2024-11-02 12:08:08.195145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.372 [2024-11-02 12:08:08.195174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.195225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.372 
[2024-11-02 12:08:08.195240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.372 #41 NEW cov: 11895 ft: 14120 corp: 18/1262b lim: 90 exec/s: 0 rss: 68Mb L: 44/90 MS: 1 EraseBytes- 00:08:21.372 [2024-11-02 12:08:08.235532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.372 [2024-11-02 12:08:08.235561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.235622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.372 [2024-11-02 12:08:08.235639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.235694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.372 [2024-11-02 12:08:08.235708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.235762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.372 [2024-11-02 12:08:08.235778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.372 #42 NEW cov: 11895 ft: 14132 corp: 19/1347b lim: 90 exec/s: 0 rss: 68Mb L: 85/90 MS: 1 InsertByte- 00:08:21.372 [2024-11-02 12:08:08.275665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.372 [2024-11-02 12:08:08.275694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.275736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.372 [2024-11-02 12:08:08.275755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.275810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.372 [2024-11-02 12:08:08.275825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.275877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.372 [2024-11-02 12:08:08.275893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.372 #43 NEW cov: 11895 ft: 14223 corp: 20/1431b lim: 90 exec/s: 43 rss: 68Mb L: 84/90 MS: 1 CopyPart- 00:08:21.372 [2024-11-02 12:08:08.315650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.372 [2024-11-02 12:08:08.315678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.315717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.372 
[2024-11-02 12:08:08.315731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.372 [2024-11-02 12:08:08.315786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.372 [2024-11-02 12:08:08.315801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.372 #44 NEW cov: 11895 ft: 14260 corp: 21/1494b lim: 90 exec/s: 44 rss: 68Mb L: 63/90 MS: 1 EraseBytes- 00:08:21.631 [2024-11-02 12:08:08.355903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.632 [2024-11-02 12:08:08.355931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.355978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.632 [2024-11-02 12:08:08.355998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.356073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.632 [2024-11-02 12:08:08.356089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.356146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.632 [2024-11-02 12:08:08.356161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.632 #45 NEW cov: 11895 ft: 14285 corp: 22/1581b lim: 90 exec/s: 45 rss: 68Mb L: 87/90 MS: 1 InsertRepeatedBytes- 00:08:21.632 [2024-11-02 12:08:08.396027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.632 [2024-11-02 12:08:08.396055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.396123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.632 [2024-11-02 12:08:08.396139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.396192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.632 [2024-11-02 12:08:08.396206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.396263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.632 [2024-11-02 12:08:08.396278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.632 #46 NEW cov: 11895 ft: 14347 corp: 23/1669b lim: 90 exec/s: 46 rss: 68Mb L: 88/90 MS: 1 ChangeASCIIInt- 00:08:21.632 [2024-11-02 12:08:08.436161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 
00:08:21.632 [2024-11-02 12:08:08.436190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.436234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.632 [2024-11-02 12:08:08.436250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.436301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.632 [2024-11-02 12:08:08.436315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.436368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.632 [2024-11-02 12:08:08.436383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.632 #47 NEW cov: 11895 ft: 14361 corp: 24/1757b lim: 90 exec/s: 47 rss: 68Mb L: 88/90 MS: 1 InsertRepeatedBytes- 00:08:21.632 [2024-11-02 12:08:08.476228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.632 [2024-11-02 12:08:08.476256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.476301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.632 [2024-11-02 12:08:08.476316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.476371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.632 [2024-11-02 12:08:08.476387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.476444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.632 [2024-11-02 12:08:08.476459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.632 #48 NEW cov: 11895 ft: 14371 corp: 25/1841b lim: 90 exec/s: 48 rss: 68Mb L: 84/90 MS: 1 ChangeBinInt- 00:08:21.632 [2024-11-02 12:08:08.516463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.632 [2024-11-02 12:08:08.516490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.516540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.632 [2024-11-02 12:08:08.516556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.516611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.632 [2024-11-02 12:08:08.516626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.516682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.632 [2024-11-02 12:08:08.516697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.516755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:21.632 [2024-11-02 12:08:08.516770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:21.632 #49 NEW cov: 11895 ft: 14407 corp: 26/1931b lim: 90 exec/s: 49 rss: 68Mb L: 90/90 MS: 1 CopyPart- 00:08:21.632 [2024-11-02 12:08:08.556568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.632 [2024-11-02 12:08:08.556596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.556647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.632 [2024-11-02 12:08:08.556663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.556717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.632 [2024-11-02 12:08:08.556732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.556784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.632 [2024-11-02 12:08:08.556798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.556854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:21.632 [2024-11-02 12:08:08.556867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:21.632 #50 NEW cov: 11895 ft: 14424 corp: 27/2021b lim: 90 exec/s: 50 rss: 68Mb L: 90/90 MS: 1 ShuffleBytes- 00:08:21.632 [2024-11-02 12:08:08.596546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.632 [2024-11-02 12:08:08.596573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.596640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.632 [2024-11-02 12:08:08.596656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.596710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.632 [2024-11-02 12:08:08.596726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.632 [2024-11-02 12:08:08.596782] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.632 [2024-11-02 12:08:08.596797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.892 #51 NEW cov: 11895 ft: 14433 corp: 28/2097b lim: 90 exec/s: 51 rss: 68Mb L: 76/90 MS: 1 CrossOver- 00:08:21.892 [2024-11-02 12:08:08.636711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.892 [2024-11-02 12:08:08.636738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.636777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.892 [2024-11-02 12:08:08.636792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.636848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.892 [2024-11-02 12:08:08.636863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.636921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.892 [2024-11-02 12:08:08.636936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.892 #52 NEW cov: 11895 ft: 14473 corp: 29/2184b lim: 90 exec/s: 52 rss: 68Mb L: 87/90 MS: 1 CrossOver- 00:08:21.892 [2024-11-02 12:08:08.676658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.892 [2024-11-02 12:08:08.676684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.676736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.892 [2024-11-02 12:08:08.676752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.676807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.892 [2024-11-02 12:08:08.676821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.892 #53 NEW cov: 11895 ft: 14492 corp: 30/2247b lim: 90 exec/s: 53 rss: 69Mb L: 63/90 MS: 1 EraseBytes- 00:08:21.892 [2024-11-02 12:08:08.716922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.892 [2024-11-02 12:08:08.716949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.717021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.892 [2024-11-02 12:08:08.717038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.717093] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.892 [2024-11-02 12:08:08.717107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.717163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.892 [2024-11-02 12:08:08.717178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.892 #54 NEW cov: 11895 ft: 14503 corp: 31/2331b lim: 90 exec/s: 54 rss: 69Mb L: 84/90 MS: 1 ChangeBinInt- 00:08:21.892 [2024-11-02 12:08:08.757193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.892 [2024-11-02 12:08:08.757219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.757272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.892 [2024-11-02 12:08:08.757287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.757342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.892 [2024-11-02 12:08:08.757357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.757410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.892 [2024-11-02 12:08:08.757424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.757477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:21.892 [2024-11-02 12:08:08.757496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:21.892 #55 NEW cov: 11895 ft: 14532 corp: 32/2421b lim: 90 exec/s: 55 rss: 69Mb L: 90/90 MS: 1 ChangeBit- 00:08:21.892 [2024-11-02 12:08:08.797220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.892 [2024-11-02 12:08:08.797247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.797314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.892 [2024-11-02 12:08:08.797330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.797385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.892 [2024-11-02 12:08:08.797401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.797454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.892 
[2024-11-02 12:08:08.797470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.892 #56 NEW cov: 11895 ft: 14560 corp: 33/2509b lim: 90 exec/s: 56 rss: 69Mb L: 88/90 MS: 1 ChangeBinInt- 00:08:21.892 [2024-11-02 12:08:08.837412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:21.892 [2024-11-02 12:08:08.837439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.837493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:21.892 [2024-11-02 12:08:08.837509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.837561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:21.892 [2024-11-02 12:08:08.837575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.837626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:21.892 [2024-11-02 12:08:08.837641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.892 [2024-11-02 12:08:08.837697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:21.892 [2024-11-02 12:08:08.837710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:21.892 #57 NEW cov: 11895 ft: 14605 corp: 34/2599b lim: 90 exec/s: 57 rss: 69Mb L: 90/90 MS: 1 CrossOver- 00:08:22.152 [2024-11-02 12:08:08.877600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:22.152 [2024-11-02 12:08:08.877627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.877679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:22.152 [2024-11-02 12:08:08.877695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.877749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:22.152 [2024-11-02 12:08:08.877764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.877815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:22.152 [2024-11-02 12:08:08.877833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.877887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:22.152 [2024-11-02 12:08:08.877903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:22.152 #58 NEW cov: 11895 ft: 14609 corp: 35/2689b lim: 90 exec/s: 58 rss: 69Mb L: 90/90 MS: 1 ShuffleBytes- 00:08:22.152 [2024-11-02 12:08:08.917637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:22.152 [2024-11-02 12:08:08.917664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.917712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:22.152 [2024-11-02 12:08:08.917728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.917781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:22.152 [2024-11-02 12:08:08.917797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.917867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:22.152 [2024-11-02 12:08:08.917883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.917937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:22.152 [2024-11-02 12:08:08.917953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:22.152 #59 NEW cov: 11895 ft: 14615 corp: 36/2779b lim: 90 exec/s: 59 rss: 69Mb L: 90/90 MS: 1 ChangeBit- 00:08:22.152 [2024-11-02 12:08:08.957623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:22.152 [2024-11-02 12:08:08.957650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.957695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:22.152 [2024-11-02 12:08:08.957711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.957763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:22.152 [2024-11-02 12:08:08.957793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.957849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:22.152 [2024-11-02 12:08:08.957865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.152 #60 NEW cov: 11895 ft: 14618 corp: 37/2867b lim: 90 exec/s: 60 rss: 69Mb L: 88/90 MS: 1 ChangeBit- 00:08:22.152 [2024-11-02 12:08:08.997909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:22.152 [2024-11-02 12:08:08.997935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.997989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:22.152 [2024-11-02 12:08:08.998007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.998068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:22.152 [2024-11-02 12:08:08.998087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.998143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:22.152 [2024-11-02 12:08:08.998158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:08.998216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:22.152 [2024-11-02 12:08:08.998233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:22.152 #61 NEW cov: 11895 ft: 14629 corp: 38/2957b lim: 90 exec/s: 61 rss: 69Mb L: 90/90 MS: 1 CMP- DE: "\001\000\000\012"- 00:08:22.152 [2024-11-02 12:08:09.037850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:22.152 [2024-11-02 12:08:09.037878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:09.037927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:22.152 [2024-11-02 12:08:09.037944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:09.038000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:22.152 [2024-11-02 12:08:09.038016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.152 [2024-11-02 12:08:09.038073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:22.152 [2024-11-02 12:08:09.038089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.153 #62 NEW cov: 11895 ft: 14635 corp: 39/3045b lim: 90 exec/s: 62 rss: 69Mb L: 88/90 MS: 1 CrossOver- 00:08:22.153 [2024-11-02 12:08:09.077837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:22.153 [2024-11-02 12:08:09.077864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.153 [2024-11-02 12:08:09.077907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:22.153 [2024-11-02 12:08:09.077923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.153 [2024-11-02 12:08:09.077975] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:22.153 [2024-11-02 12:08:09.077990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.153 #63 NEW cov: 11895 ft: 14679 corp: 40/3111b lim: 90 exec/s: 63 rss: 69Mb L: 66/90 MS: 1 ChangeBit- 00:08:22.153 [2024-11-02 12:08:09.118130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:22.153 [2024-11-02 12:08:09.118156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.153 [2024-11-02 12:08:09.118223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:22.153 [2024-11-02 12:08:09.118240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.153 [2024-11-02 12:08:09.118298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:22.153 [2024-11-02 12:08:09.118312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.153 [2024-11-02 12:08:09.118370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:22.153 [2024-11-02 12:08:09.118385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.413 #64 NEW cov: 11895 ft: 14683 corp: 41/3184b lim: 90 exec/s: 64 rss: 69Mb L: 73/90 MS: 1 InsertRepeatedBytes- 00:08:22.413 [2024-11-02 12:08:09.158189] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:22.413 [2024-11-02 12:08:09.158216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.413 [2024-11-02 12:08:09.158258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:22.413 [2024-11-02 12:08:09.158271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.413 [2024-11-02 12:08:09.158323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:22.413 [2024-11-02 12:08:09.158338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.413 [2024-11-02 12:08:09.158392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:22.413 [2024-11-02 12:08:09.158407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.413 #65 NEW cov: 11895 ft: 14686 corp: 42/3269b lim: 90 exec/s: 65 rss: 69Mb L: 85/90 MS: 1 ShuffleBytes- 00:08:22.413 [2024-11-02 12:08:09.198516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:22.413 [2024-11-02 12:08:09.198544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.413 [2024-11-02 
12:08:09.198603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:22.413 [2024-11-02 12:08:09.198617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.413 [2024-11-02 12:08:09.198673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:22.413 [2024-11-02 12:08:09.198689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.413 [2024-11-02 12:08:09.198742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:22.413 [2024-11-02 12:08:09.198758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.413 [2024-11-02 12:08:09.198815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:22.413 [2024-11-02 12:08:09.198830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:22.413 #66 NEW cov: 11895 ft: 14700 corp: 43/3359b lim: 90 exec/s: 66 rss: 69Mb L: 90/90 MS: 1 CopyPart- 00:08:22.413 [2024-11-02 12:08:09.238446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:22.413 [2024-11-02 12:08:09.238472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.413 [2024-11-02 12:08:09.238539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:22.413 [2024-11-02 12:08:09.238555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.413 [2024-11-02 12:08:09.238610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:22.413 [2024-11-02 12:08:09.238624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.413 [2024-11-02 12:08:09.238682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:22.413 [2024-11-02 12:08:09.238698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.413 #67 NEW cov: 11895 ft: 14701 corp: 44/3447b lim: 90 exec/s: 33 rss: 69Mb L: 88/90 MS: 1 CrossOver- 00:08:22.413 #67 DONE cov: 11895 ft: 14701 corp: 44/3447b lim: 90 exec/s: 33 rss: 69Mb 00:08:22.413 ###### Recommended dictionary. ###### 00:08:22.413 "\001\000\000\012" # Uses: 0 00:08:22.413 ###### End of recommended dictionary. 
###### 00:08:22.413 Done 67 runs in 2 second(s) 00:08:22.413 12:08:09 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:22.413 12:08:09 -- ../common.sh@72 -- # (( i++ )) 00:08:22.413 12:08:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:22.413 12:08:09 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:22.413 12:08:09 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:22.413 12:08:09 -- nvmf/run.sh@24 -- # local timen=1 00:08:22.413 12:08:09 -- nvmf/run.sh@25 -- # local core=0x1 00:08:22.413 12:08:09 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:22.413 12:08:09 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:22.413 12:08:09 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:22.413 12:08:09 -- nvmf/run.sh@29 -- # port=4421 00:08:22.413 12:08:09 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:22.672 12:08:09 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:22.672 12:08:09 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:22.673 12:08:09 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:22.673 [2024-11-02 12:08:09.420858] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:22.673 [2024-11-02 12:08:09.420937] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1153320 ] 00:08:22.673 EAL: No free 2048 kB hugepages reported on node 1 00:08:22.673 [2024-11-02 12:08:09.597576] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.673 [2024-11-02 12:08:09.617190] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:22.673 [2024-11-02 12:08:09.617324] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.931 [2024-11-02 12:08:09.668657] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:22.931 [2024-11-02 12:08:09.685033] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:22.931 INFO: Running with entropic power schedule (0xFF, 100). 00:08:22.931 INFO: Seed: 3458044254 00:08:22.931 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:22.931 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:22.931 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:22.931 INFO: A corpus is not provided, starting from an empty corpus 00:08:22.931 #2 INITED exec/s: 0 rss: 59Mb 00:08:22.931 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:22.931 This may also happen if the target rejected all inputs we tried so far 00:08:22.931 [2024-11-02 12:08:09.753578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:22.931 [2024-11-02 12:08:09.753614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.931 [2024-11-02 12:08:09.753756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:22.931 [2024-11-02 12:08:09.753782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.931 [2024-11-02 12:08:09.753913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:22.931 [2024-11-02 12:08:09.753936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.190 NEW_FUNC[1/672]: 0x4767c8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:23.190 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.190 #7 NEW cov: 11643 ft: 11631 corp: 2/36b lim: 50 exec/s: 0 rss: 67Mb L: 35/35 MS: 5 InsertByte-EraseBytes-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:23.190 [2024-11-02 12:08:10.084419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.190 [2024-11-02 12:08:10.084477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.190 [2024-11-02 12:08:10.084611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.191 [2024-11-02 12:08:10.084642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.191 [2024-11-02 12:08:10.084778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.191 [2024-11-02 12:08:10.084808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.191 #8 NEW cov: 11756 ft: 12183 corp: 3/66b lim: 50 exec/s: 0 rss: 67Mb L: 30/35 MS: 1 CrossOver- 00:08:23.191 [2024-11-02 12:08:10.124388] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.191 [2024-11-02 12:08:10.124423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.191 [2024-11-02 12:08:10.124543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.191 [2024-11-02 12:08:10.124565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.191 [2024-11-02 12:08:10.124683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.191 [2024-11-02 12:08:10.124706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:23.191 #9 NEW cov: 11762 ft: 12353 corp: 4/101b lim: 50 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 ChangeBit- 00:08:23.450 [2024-11-02 12:08:10.174596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.450 [2024-11-02 12:08:10.174629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.450 [2024-11-02 12:08:10.174707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.450 [2024-11-02 12:08:10.174727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.450 [2024-11-02 12:08:10.174843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.450 [2024-11-02 12:08:10.174863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.450 #10 NEW cov: 11847 ft: 12565 corp: 5/133b lim: 50 exec/s: 0 rss: 67Mb L: 32/35 MS: 1 CrossOver- 00:08:23.450 [2024-11-02 12:08:10.214649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.450 [2024-11-02 12:08:10.214684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.450 [2024-11-02 12:08:10.214813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.450 [2024-11-02 12:08:10.214836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.450 [2024-11-02 12:08:10.214957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.450 [2024-11-02 12:08:10.214977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.450 #11 NEW cov: 11847 ft: 12772 corp: 6/171b lim: 50 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:23.450 [2024-11-02 12:08:10.254754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.450 [2024-11-02 12:08:10.254785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.450 [2024-11-02 12:08:10.254890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.450 [2024-11-02 12:08:10.254913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.450 [2024-11-02 12:08:10.255032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.450 [2024-11-02 12:08:10.255054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.450 #12 NEW cov: 11847 ft: 12836 corp: 7/207b lim: 50 exec/s: 0 rss: 67Mb L: 36/38 MS: 1 InsertByte- 00:08:23.450 [2024-11-02 12:08:10.295110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.450 [2024-11-02 12:08:10.295139] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.451 [2024-11-02 12:08:10.295259] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.451 [2024-11-02 12:08:10.295281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.451 [2024-11-02 12:08:10.295397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.451 [2024-11-02 12:08:10.295415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.451 [2024-11-02 12:08:10.295538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:23.451 [2024-11-02 12:08:10.295559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.451 #13 NEW cov: 11847 ft: 13221 corp: 8/252b lim: 50 exec/s: 0 rss: 67Mb L: 45/45 MS: 1 CrossOver- 00:08:23.451 [2024-11-02 12:08:10.345129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.451 [2024-11-02 12:08:10.345159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.451 [2024-11-02 12:08:10.345262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.451 [2024-11-02 12:08:10.345278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.451 [2024-11-02 12:08:10.345391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.451 [2024-11-02 12:08:10.345411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.451 #14 NEW cov: 11847 ft: 13314 corp: 9/284b lim: 50 exec/s: 0 rss: 67Mb L: 32/45 MS: 1 ChangeByte- 00:08:23.451 [2024-11-02 12:08:10.385082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.451 [2024-11-02 12:08:10.385114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.451 [2024-11-02 12:08:10.385198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.451 [2024-11-02 12:08:10.385218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.451 [2024-11-02 12:08:10.385336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.451 [2024-11-02 12:08:10.385358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.451 #15 NEW cov: 11847 ft: 13398 corp: 10/316b lim: 50 exec/s: 0 rss: 68Mb L: 32/45 MS: 1 ChangeBit- 00:08:23.451 [2024-11-02 12:08:10.425374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.451 [2024-11-02 12:08:10.425407] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.451 [2024-11-02 12:08:10.425514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.451 [2024-11-02 12:08:10.425536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.451 [2024-11-02 12:08:10.425658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.451 [2024-11-02 12:08:10.425676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.710 #16 NEW cov: 11847 ft: 13421 corp: 11/351b lim: 50 exec/s: 0 rss: 68Mb L: 35/45 MS: 1 ChangeBit- 00:08:23.710 [2024-11-02 12:08:10.465377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.710 [2024-11-02 12:08:10.465409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.710 [2024-11-02 12:08:10.465526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.710 [2024-11-02 12:08:10.465549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.710 [2024-11-02 12:08:10.465664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.710 [2024-11-02 12:08:10.465685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.710 #17 NEW cov: 11847 ft: 13448 corp: 12/388b lim: 50 exec/s: 0 rss: 68Mb L: 37/45 MS: 1 InsertRepeatedBytes- 00:08:23.710 [2024-11-02 12:08:10.505455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.710 [2024-11-02 12:08:10.505485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.710 [2024-11-02 12:08:10.505595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.710 [2024-11-02 12:08:10.505618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.710 [2024-11-02 12:08:10.505742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.711 [2024-11-02 12:08:10.505763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.711 #23 NEW cov: 11847 ft: 13461 corp: 13/425b lim: 50 exec/s: 0 rss: 68Mb L: 37/45 MS: 1 ShuffleBytes- 00:08:23.711 [2024-11-02 12:08:10.555208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.711 [2024-11-02 12:08:10.555238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.711 #27 NEW cov: 11847 ft: 14257 corp: 14/439b lim: 50 exec/s: 0 rss: 68Mb L: 14/45 MS: 4 InsertByte-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:23.711 [2024-11-02 12:08:10.595350] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.711 [2024-11-02 12:08:10.595383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.711 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:23.711 #28 NEW cov: 11870 ft: 14292 corp: 15/453b lim: 50 exec/s: 0 rss: 68Mb L: 14/45 MS: 1 ChangeByte- 00:08:23.711 [2024-11-02 12:08:10.645812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.711 [2024-11-02 12:08:10.645846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.711 [2024-11-02 12:08:10.645949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.711 [2024-11-02 12:08:10.645967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.711 [2024-11-02 12:08:10.646091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:23.711 [2024-11-02 12:08:10.646112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.711 #29 NEW cov: 11870 ft: 14331 corp: 16/491b lim: 50 exec/s: 0 rss: 68Mb L: 38/45 MS: 1 InsertByte- 00:08:23.711 [2024-11-02 12:08:10.686105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:23.711 [2024-11-02 12:08:10.686139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.711 [2024-11-02 12:08:10.686242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:23.711 [2024-11-02 12:08:10.686264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.686387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:24.024 [2024-11-02 12:08:10.686406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.024 #30 NEW cov: 11870 ft: 14367 corp: 17/526b lim: 50 exec/s: 0 rss: 68Mb L: 35/45 MS: 1 InsertRepeatedBytes- 00:08:24.024 [2024-11-02 12:08:10.726068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:24.024 [2024-11-02 12:08:10.726100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.726194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:24.024 [2024-11-02 12:08:10.726217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.726340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:24.024 [2024-11-02 12:08:10.726364] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.024 #31 NEW cov: 11870 ft: 14413 corp: 18/559b lim: 50 exec/s: 31 rss: 68Mb L: 33/45 MS: 1 InsertByte- 00:08:24.024 [2024-11-02 12:08:10.765983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:24.024 [2024-11-02 12:08:10.766021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.766161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:24.024 [2024-11-02 12:08:10.766183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.024 #32 NEW cov: 11870 ft: 14672 corp: 19/586b lim: 50 exec/s: 32 rss: 68Mb L: 27/45 MS: 1 EraseBytes- 00:08:24.024 [2024-11-02 12:08:10.806465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:24.024 [2024-11-02 12:08:10.806495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.806586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:24.024 [2024-11-02 12:08:10.806607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.806736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:24.024 [2024-11-02 12:08:10.806757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.024 #33 NEW cov: 11870 ft: 14682 corp: 20/621b lim: 50 exec/s: 33 rss: 68Mb L: 35/45 MS: 1 ChangeByte- 00:08:24.024 [2024-11-02 12:08:10.846450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:24.024 [2024-11-02 12:08:10.846479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.846544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:24.024 [2024-11-02 12:08:10.846560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.846676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:24.024 [2024-11-02 12:08:10.846696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.024 #34 NEW cov: 11870 ft: 14716 corp: 21/653b lim: 50 exec/s: 34 rss: 68Mb L: 32/45 MS: 1 ShuffleBytes- 00:08:24.024 [2024-11-02 12:08:10.886587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:24.024 [2024-11-02 12:08:10.886619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.886705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) 
sqid:1 cid:1 nsid:0 00:08:24.024 [2024-11-02 12:08:10.886726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.886838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:24.024 [2024-11-02 12:08:10.886854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.024 #35 NEW cov: 11870 ft: 14732 corp: 22/689b lim: 50 exec/s: 35 rss: 68Mb L: 36/45 MS: 1 InsertByte- 00:08:24.024 [2024-11-02 12:08:10.926682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:24.024 [2024-11-02 12:08:10.926713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.926816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:24.024 [2024-11-02 12:08:10.926837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.024 [2024-11-02 12:08:10.926953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:24.024 [2024-11-02 12:08:10.926971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.024 #36 NEW cov: 11870 ft: 14740 corp: 23/725b lim: 50 exec/s: 36 rss: 68Mb L: 36/45 MS: 1 InsertByte- 00:08:24.391 [2024-11-02 12:08:10.966893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:24.391 [2024-11-02 12:08:10.966926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.391 [2024-11-02 12:08:10.967033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:24.391 [2024-11-02 12:08:10.967054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.391 [2024-11-02 12:08:10.967170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:24.391 [2024-11-02 12:08:10.967191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.391 #37 NEW cov: 11870 ft: 14801 corp: 24/757b lim: 50 exec/s: 37 rss: 68Mb L: 32/45 MS: 1 ChangeBit- 00:08:24.391 [2024-11-02 12:08:11.006916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:24.391 [2024-11-02 12:08:11.006947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.391 [2024-11-02 12:08:11.007042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:24.391 [2024-11-02 12:08:11.007060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.391 [2024-11-02 12:08:11.007171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 
cid:2 nsid:0
00:08:24.391 [2024-11-02 12:08:11.007191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.391 #38 NEW cov: 11870 ft: 14812 corp: 25/795b lim: 50 exec/s: 38 rss: 68Mb L: 38/45 MS: 1 CopyPart-
00:08:24.391 [2024-11-02 12:08:11.046925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.391 [2024-11-02 12:08:11.046950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.047074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.391 [2024-11-02 12:08:11.047097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.391 #39 NEW cov: 11870 ft: 14829 corp: 26/824b lim: 50 exec/s: 39 rss: 68Mb L: 29/45 MS: 1 CrossOver-
00:08:24.391 [2024-11-02 12:08:11.087311] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.391 [2024-11-02 12:08:11.087341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.087452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.391 [2024-11-02 12:08:11.087473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.087590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.391 [2024-11-02 12:08:11.087607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.391 #40 NEW cov: 11870 ft: 14884 corp: 27/856b lim: 50 exec/s: 40 rss: 68Mb L: 32/45 MS: 1 ChangeBit-
00:08:24.391 [2024-11-02 12:08:11.127340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.391 [2024-11-02 12:08:11.127369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.127467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.391 [2024-11-02 12:08:11.127490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.127610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.391 [2024-11-02 12:08:11.127628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.391 #41 NEW cov: 11870 ft: 14899 corp: 28/888b lim: 50 exec/s: 41 rss: 69Mb L: 32/45 MS: 1 ChangeByte-
00:08:24.391 [2024-11-02 12:08:11.167447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.391 [2024-11-02 12:08:11.167476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.167580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.391 [2024-11-02 12:08:11.167603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.167715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.391 [2024-11-02 12:08:11.167733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.391 #42 NEW cov: 11870 ft: 14914 corp: 29/925b lim: 50 exec/s: 42 rss: 69Mb L: 37/45 MS: 1 ChangeBit-
00:08:24.391 [2024-11-02 12:08:11.207616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.391 [2024-11-02 12:08:11.207651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.207767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.391 [2024-11-02 12:08:11.207790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.207905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.391 [2024-11-02 12:08:11.207926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.391 #43 NEW cov: 11870 ft: 14928 corp: 30/961b lim: 50 exec/s: 43 rss: 69Mb L: 36/45 MS: 1 CopyPart-
00:08:24.391 [2024-11-02 12:08:11.257758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.391 [2024-11-02 12:08:11.257789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.257896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.391 [2024-11-02 12:08:11.257919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.258042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.391 [2024-11-02 12:08:11.258064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.391 #44 NEW cov: 11870 ft: 14943 corp: 31/991b lim: 50 exec/s: 44 rss: 69Mb L: 30/45 MS: 1 EraseBytes-
00:08:24.391 [2024-11-02 12:08:11.307977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.391 [2024-11-02 12:08:11.308015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.308141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.391 [2024-11-02 12:08:11.308165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.308283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.391 [2024-11-02 12:08:11.308304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.391 #45 NEW cov: 11870 ft: 14981 corp: 32/1023b lim: 50 exec/s: 45 rss: 69Mb L: 32/45 MS: 1 ChangeByte-
00:08:24.391 [2024-11-02 12:08:11.358399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.391 [2024-11-02 12:08:11.358430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.358502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.391 [2024-11-02 12:08:11.358523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.358643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.391 [2024-11-02 12:08:11.358665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.391 [2024-11-02 12:08:11.358786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0
00:08:24.391 [2024-11-02 12:08:11.358804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:24.650 #46 NEW cov: 11870 ft: 14983 corp: 33/1065b lim: 50 exec/s: 46 rss: 69Mb L: 42/45 MS: 1 InsertRepeatedBytes-
00:08:24.650 [2024-11-02 12:08:11.398174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.650 [2024-11-02 12:08:11.398206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.650 [2024-11-02 12:08:11.398291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.650 [2024-11-02 12:08:11.398313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.650 [2024-11-02 12:08:11.398425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.650 [2024-11-02 12:08:11.398447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.650 #47 NEW cov: 11870 ft: 15014 corp: 34/1101b lim: 50 exec/s: 47 rss: 69Mb L: 36/45 MS: 1 CopyPart-
00:08:24.650 [2024-11-02 12:08:11.438332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.651 [2024-11-02 12:08:11.438364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.651 [2024-11-02 12:08:11.438466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.651 [2024-11-02 12:08:11.438490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.651 [2024-11-02 12:08:11.438602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.651 [2024-11-02 12:08:11.438626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.651 #48 NEW cov: 11870 ft: 15032 corp: 35/1139b lim: 50 exec/s: 48 rss: 69Mb L: 38/45 MS: 1 ChangeBinInt-
00:08:24.651 [2024-11-02 12:08:11.488317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.651 [2024-11-02 12:08:11.488347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.651 [2024-11-02 12:08:11.488478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.651 [2024-11-02 12:08:11.488502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.651 [2024-11-02 12:08:11.488632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.651 [2024-11-02 12:08:11.488655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.651 #49 NEW cov: 11870 ft: 15038 corp: 36/1176b lim: 50 exec/s: 49 rss: 69Mb L: 37/45 MS: 1 ShuffleBytes-
00:08:24.651 [2024-11-02 12:08:11.528558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.651 [2024-11-02 12:08:11.528589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.651 [2024-11-02 12:08:11.528720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.651 [2024-11-02 12:08:11.528742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.651 [2024-11-02 12:08:11.528869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.651 [2024-11-02 12:08:11.528894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.651 #50 NEW cov: 11870 ft: 15045 corp: 37/1212b lim: 50 exec/s: 50 rss: 69Mb L: 36/45 MS: 1 ChangeBit-
00:08:24.651 [2024-11-02 12:08:11.568680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.651 [2024-11-02 12:08:11.568713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.651 [2024-11-02 12:08:11.568809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.651 [2024-11-02 12:08:11.568832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.651 [2024-11-02 12:08:11.568950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.651 [2024-11-02 12:08:11.568972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.651 #51 NEW cov: 11870 ft: 15061 corp: 38/1244b lim: 50 exec/s: 51 rss: 69Mb L: 32/45 MS: 1 ChangeBit-
00:08:24.651 [2024-11-02 12:08:11.608686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.651 [2024-11-02 12:08:11.608716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.651 [2024-11-02 12:08:11.608811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.651 [2024-11-02 12:08:11.608831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.651 [2024-11-02 12:08:11.608940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.651 [2024-11-02 12:08:11.608962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.911 #52 NEW cov: 11870 ft: 15072 corp: 39/1277b lim: 50 exec/s: 52 rss: 69Mb L: 33/45 MS: 1 InsertByte-
00:08:24.911 [2024-11-02 12:08:11.648645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.911 [2024-11-02 12:08:11.648678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.911 [2024-11-02 12:08:11.648791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.911 [2024-11-02 12:08:11.648817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.911 #53 NEW cov: 11870 ft: 15089 corp: 40/1301b lim: 50 exec/s: 53 rss: 69Mb L: 24/45 MS: 1 CrossOver-
00:08:24.911 [2024-11-02 12:08:11.689263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.911 [2024-11-02 12:08:11.689295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.911 [2024-11-02 12:08:11.689400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.911 [2024-11-02 12:08:11.689422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.911 [2024-11-02 12:08:11.689534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.911 [2024-11-02 12:08:11.689556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.911 [2024-11-02 12:08:11.689685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0
00:08:24.911 [2024-11-02 12:08:11.689705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:24.911 #54 NEW cov: 11870 ft: 15097 corp: 41/1346b lim: 50 exec/s: 54 rss: 69Mb L: 45/45 MS: 1 ShuffleBytes-
00:08:24.911 [2024-11-02 12:08:11.739440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
00:08:24.911 [2024-11-02 12:08:11.739472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.911 [2024-11-02 12:08:11.739572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0
00:08:24.911 [2024-11-02 12:08:11.739593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.911 [2024-11-02 12:08:11.739715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0
00:08:24.911 [2024-11-02 12:08:11.739737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.911 [2024-11-02 12:08:11.739862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0
00:08:24.911 [2024-11-02 12:08:11.739884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:24.911 #56 NEW cov: 11870 ft: 15108 corp: 42/1394b lim: 50 exec/s: 28 rss: 69Mb L: 48/48 MS: 2 CrossOver-InsertRepeatedBytes-
00:08:24.911 #56 DONE cov: 11870 ft: 15108 corp: 42/1394b lim: 50 exec/s: 28 rss: 69Mb
00:08:24.911 Done 56 runs in 2 second(s)
00:08:24.911 12:08:11 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf
12:08:11 -- ../common.sh@72 -- # (( i++ ))
12:08:11 -- ../common.sh@72 -- # (( i < fuzz_num ))
12:08:11 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1
12:08:11 -- nvmf/run.sh@23 -- # local fuzzer_type=22
12:08:11 -- nvmf/run.sh@24 -- # local timen=1
12:08:11 -- nvmf/run.sh@25 -- # local core=0x1
12:08:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
12:08:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf
12:08:11 -- nvmf/run.sh@29 -- # printf %02d 22
00:08:25.171 12:08:11 -- nvmf/run.sh@29 -- # port=4422
12:08:11 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
12:08:11 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422'
12:08:11 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
12:08:11 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock
00:08:25.171 [2024-11-02 12:08:11.924538] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization...
00:08:25.171 [2024-11-02 12:08:11.924625] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1153862 ]
00:08:25.171 EAL: No free 2048 kB hugepages reported on node 1
00:08:25.171 [2024-11-02 12:08:12.101734] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:25.171 [2024-11-02 12:08:12.121024] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:25.171 [2024-11-02 12:08:12.121171] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:25.171 [2024-11-02 12:08:12.172650] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:25.171 [2024-11-02 12:08:12.188984] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 ***
00:08:25.430 INFO: Running with entropic power schedule (0xFF, 100).
00:08:25.430 INFO: Seed: 1669079939
00:08:25.430 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:25.430 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:25.430 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:25.430 INFO: A corpus is not provided, starting from an empty corpus
00:08:25.430 #2 INITED exec/s: 0 rss: 59Mb
00:08:25.430 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:25.430 This may also happen if the target rejected all inputs we tried so far
00:08:25.430 [2024-11-02 12:08:12.244035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:25.430 [2024-11-02 12:08:12.244067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:25.689 NEW_FUNC[1/672]: 0x478a98 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644
00:08:25.689 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:25.689 #24 NEW cov: 11669 ft: 11670 corp: 2/24b lim: 85 exec/s: 0 rss: 67Mb L: 23/23 MS: 2 ShuffleBytes-InsertRepeatedBytes-
00:08:25.689 [2024-11-02 12:08:12.565490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:25.689 [2024-11-02 12:08:12.565539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:25.689 #25 NEW cov: 11782 ft: 12519 corp: 3/47b lim: 85 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 CopyPart-
00:08:25.689 [2024-11-02 12:08:12.615528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:25.689 [2024-11-02 12:08:12.615563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:25.689 #31 NEW cov: 11788 ft: 12760 corp: 4/70b lim: 85 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 ChangeByte-
00:08:25.689 [2024-11-02 12:08:12.655641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:25.689 [2024-11-02 12:08:12.655677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:25.949 #32 NEW cov: 11873 ft: 13054 corp: 5/94b lim: 85 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 InsertByte-
00:08:25.949 [2024-11-02 12:08:12.695745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:25.949 [2024-11-02 12:08:12.695771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:25.949 #33 NEW cov: 11873 ft: 13133 corp: 6/126b lim: 85 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 CMP- DE: "_\031\260\210\3779\177\000"-
00:08:25.949 [2024-11-02 12:08:12.736632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:25.949 [2024-11-02 12:08:12.736762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:25.949 [2024-11-02 12:08:12.736905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:25.949 [2024-11-02 12:08:12.737042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:25.949 #34 NEW cov: 11873 ft: 14135 corp: 7/200b lim: 85 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 InsertRepeatedBytes-
00:08:25.949 [2024-11-02 12:08:12.776684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:25.949 [2024-11-02 12:08:12.776824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:25.949 [2024-11-02 12:08:12.776962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:25.949 [2024-11-02 12:08:12.777100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0
00:08:25.949 #35 NEW cov: 11873 ft: 14189 corp: 8/274b lim: 85 exec/s: 0 rss: 67Mb L: 74/74 MS: 1 ChangeBit-
00:08:25.949 [2024-11-02 12:08:12.826213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER
(0d) sqid:1 cid:0 nsid:0 00:08:25.949 [2024-11-02 12:08:12.826240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.949 #36 NEW cov: 11873 ft: 14243 corp: 9/298b lim: 85 exec/s: 0 rss: 67Mb L: 24/74 MS: 1 ChangeByte- 00:08:25.949 [2024-11-02 12:08:12.866325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:25.949 [2024-11-02 12:08:12.866354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.949 #37 NEW cov: 11873 ft: 14276 corp: 10/321b lim: 85 exec/s: 0 rss: 67Mb L: 23/74 MS: 1 ShuffleBytes- 00:08:25.949 [2024-11-02 12:08:12.906386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:25.949 [2024-11-02 12:08:12.906424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.209 #38 NEW cov: 11873 ft: 14329 corp: 11/344b lim: 85 exec/s: 0 rss: 67Mb L: 23/74 MS: 1 CopyPart- 00:08:26.209 [2024-11-02 12:08:12.957297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.209 [2024-11-02 12:08:12.957333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.209 [2024-11-02 12:08:12.957453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:26.209 [2024-11-02 12:08:12.957472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.209 [2024-11-02 12:08:12.957587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:26.209 [2024-11-02 12:08:12.957610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.209 [2024-11-02 12:08:12.957729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:26.209 [2024-11-02 12:08:12.957750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.209 #39 NEW cov: 11873 ft: 14356 corp: 12/426b lim: 85 exec/s: 0 rss: 67Mb L: 82/82 MS: 1 PersAutoDict- DE: "_\031\260\210\3779\177\000"- 00:08:26.209 [2024-11-02 12:08:12.996658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.209 [2024-11-02 12:08:12.996686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.209 #40 NEW cov: 11873 ft: 14393 corp: 13/459b lim: 85 exec/s: 0 rss: 68Mb L: 33/82 MS: 1 InsertByte- 00:08:26.209 [2024-11-02 12:08:13.046801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.209 [2024-11-02 12:08:13.046832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.209 #41 NEW cov: 11873 ft: 14403 corp: 14/491b lim: 85 exec/s: 0 rss: 68Mb L: 32/82 MS: 1 ChangeBinInt- 00:08:26.209 [2024-11-02 12:08:13.087001] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.209 [2024-11-02 12:08:13.087035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.209 #42 NEW cov: 11873 ft: 14442 corp: 15/523b lim: 85 exec/s: 0 rss: 68Mb L: 32/82 MS: 1 CrossOver- 00:08:26.209 [2024-11-02 12:08:13.137062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.209 [2024-11-02 12:08:13.137088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.209 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:26.209 #43 NEW cov: 11896 ft: 14475 corp: 16/555b lim: 85 exec/s: 0 rss: 68Mb L: 32/82 MS: 1 ChangeBinInt- 00:08:26.475 [2024-11-02 12:08:13.187908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.475 [2024-11-02 12:08:13.187942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.475 [2024-11-02 12:08:13.188061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:26.475 [2024-11-02 12:08:13.188085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.475 [2024-11-02 12:08:13.188213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:26.475 [2024-11-02 12:08:13.188235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.475 [2024-11-02 12:08:13.188365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:26.475 [2024-11-02 12:08:13.188390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.475 #44 NEW cov: 11896 ft: 14482 corp: 17/630b lim: 85 exec/s: 0 rss: 68Mb L: 75/82 MS: 1 InsertByte- 00:08:26.475 [2024-11-02 12:08:13.238290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.475 [2024-11-02 12:08:13.238319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.475 [2024-11-02 12:08:13.238401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:26.476 [2024-11-02 12:08:13.238420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.476 [2024-11-02 12:08:13.238540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:26.476 [2024-11-02 12:08:13.238560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.476 [2024-11-02 12:08:13.238681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:26.476 [2024-11-02 12:08:13.238705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.476 [2024-11-02 12:08:13.238823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:26.476 [2024-11-02 12:08:13.238845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:26.476 #45 NEW cov: 11896 ft: 14530 corp: 18/715b lim: 85 exec/s: 45 rss: 68Mb L: 85/85 MS: 1 CrossOver- 00:08:26.476 [2024-11-02 12:08:13.287710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.476 [2024-11-02 12:08:13.287743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.476 [2024-11-02 12:08:13.287855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:26.476 [2024-11-02 12:08:13.287878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.476 #48 NEW cov: 11896 ft: 14834 corp: 19/765b lim: 85 exec/s: 48 rss: 68Mb L: 50/85 MS: 3 ShuffleBytes-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:26.476 [2024-11-02 12:08:13.328009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.477 [2024-11-02 12:08:13.328041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.477 [2024-11-02 12:08:13.328162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:26.477 [2024-11-02 12:08:13.328201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.477 [2024-11-02 12:08:13.328323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:26.477 [2024-11-02 12:08:13.328343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.477 #49 NEW cov: 11896 ft: 15109 corp: 20/821b lim: 85 exec/s: 49 rss: 68Mb L: 56/85 MS: 1 EraseBytes- 00:08:26.477 [2024-11-02 12:08:13.378453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.477 [2024-11-02 12:08:13.378485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.477 [2024-11-02 12:08:13.378570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:26.477 [2024-11-02 12:08:13.378593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.477 [2024-11-02 12:08:13.378713] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:26.477 [2024-11-02 12:08:13.378739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.477 [2024-11-02 12:08:13.378855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:26.477 [2024-11-02 12:08:13.378873] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.477 #50 NEW cov: 11896 ft: 15176 corp: 21/895b lim: 85 exec/s: 50 rss: 68Mb L: 74/85 MS: 1 ChangeByte- 00:08:26.477 [2024-11-02 12:08:13.418621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.477 [2024-11-02 12:08:13.418656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.477 [2024-11-02 12:08:13.418765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:26.477 [2024-11-02 12:08:13.418784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.477 [2024-11-02 12:08:13.418900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:26.477 [2024-11-02 12:08:13.418924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.477 [2024-11-02 12:08:13.419056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:26.477 [2024-11-02 12:08:13.419076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.477 #51 NEW cov: 11896 ft: 15208 corp: 22/978b lim: 85 exec/s: 51 rss: 68Mb L: 83/85 MS: 1 InsertByte- 00:08:26.746 [2024-11-02 12:08:13.458741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.746 [2024-11-02 12:08:13.458791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.746 [2024-11-02 12:08:13.458915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:26.746 [2024-11-02 12:08:13.458939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.746 [2024-11-02 12:08:13.459068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:26.746 [2024-11-02 12:08:13.459093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.746 [2024-11-02 12:08:13.459221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:26.746 [2024-11-02 12:08:13.459246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.746 #52 NEW cov: 11896 ft: 15224 corp: 23/1061b lim: 85 exec/s: 52 rss: 68Mb L: 83/85 MS: 1 ChangeBit- 00:08:26.746 [2024-11-02 12:08:13.508153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.746 [2024-11-02 12:08:13.508179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.746 #53 NEW cov: 11896 ft: 15232 corp: 24/1093b lim: 85 exec/s: 53 rss: 68Mb L: 32/85 MS: 1 PersAutoDict- DE: "_\031\260\210\3779\177\000"- 00:08:26.746 [2024-11-02 12:08:13.549034] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.746 [2024-11-02 12:08:13.549064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.746 [2024-11-02 12:08:13.549159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:26.746 [2024-11-02 12:08:13.549181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.746 [2024-11-02 12:08:13.549301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:26.746 [2024-11-02 12:08:13.549327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.747 [2024-11-02 12:08:13.549448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:26.747 [2024-11-02 12:08:13.549467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.747 #54 NEW cov: 11896 ft: 15253 corp: 25/1177b lim: 85 exec/s: 54 rss: 68Mb L: 84/85 MS: 1 CopyPart- 00:08:26.747 [2024-11-02 12:08:13.599365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.747 [2024-11-02 12:08:13.599398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.747 [2024-11-02 12:08:13.599494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:26.747 [2024-11-02 12:08:13.599517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.747 [2024-11-02 12:08:13.599631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:26.747 [2024-11-02 12:08:13.599654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.747 [2024-11-02 12:08:13.599766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:26.747 [2024-11-02 12:08:13.599789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.747 [2024-11-02 12:08:13.599907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:26.747 [2024-11-02 12:08:13.599929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:26.747 #55 NEW cov: 11896 ft: 15272 corp: 26/1262b lim: 85 exec/s: 55 rss: 68Mb L: 85/85 MS: 1 ChangeBit- 00:08:26.747 [2024-11-02 12:08:13.639439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.747 [2024-11-02 12:08:13.639470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.747 [2024-11-02 12:08:13.639555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 
00:08:26.747 [2024-11-02 12:08:13.639573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.747 [2024-11-02 12:08:13.639690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:26.747 [2024-11-02 12:08:13.639713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.747 [2024-11-02 12:08:13.639836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:26.747 [2024-11-02 12:08:13.639859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.747 [2024-11-02 12:08:13.639984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:26.747 [2024-11-02 12:08:13.640012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:26.747 #56 NEW cov: 11896 ft: 15297 corp: 27/1347b lim: 85 exec/s: 56 rss: 68Mb L: 85/85 MS: 1 ChangeBinInt- 00:08:26.747 [2024-11-02 12:08:13.678934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:26.747 [2024-11-02 12:08:13.678963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.747 [2024-11-02 12:08:13.679084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:26.747 [2024-11-02 12:08:13.679106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.747 #57 NEW cov: 11896 ft: 15310 corp: 28/1387b lim: 85 exec/s: 57 rss: 69Mb L: 40/85 MS: 1 PersAutoDict- DE: "_\031\260\210\3779\177\000"- 00:08:27.006 [2024-11-02 12:08:13.729484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.006 [2024-11-02 12:08:13.729518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.006 [2024-11-02 12:08:13.729618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:27.006 [2024-11-02 12:08:13.729641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.006 [2024-11-02 12:08:13.729764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:27.006 [2024-11-02 12:08:13.729785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.006 [2024-11-02 12:08:13.729898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:27.006 [2024-11-02 12:08:13.729921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.006 #58 NEW cov: 11896 ft: 15318 corp: 29/1470b lim: 85 exec/s: 58 rss: 69Mb L: 83/85 MS: 1 ChangeByte- 00:08:27.006 [2024-11-02 12:08:13.768901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.006 [2024-11-02 12:08:13.768935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.006 #59 NEW cov: 11896 ft: 15329 corp: 30/1503b lim: 85 exec/s: 59 rss: 69Mb L: 33/85 MS: 1 ChangeASCIIInt- 00:08:27.006 [2024-11-02 12:08:13.809573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.006 [2024-11-02 12:08:13.809605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.006 [2024-11-02 12:08:13.809690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:27.006 [2024-11-02 12:08:13.809714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.006 [2024-11-02 12:08:13.809834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:27.006 [2024-11-02 12:08:13.809857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.006 #60 NEW cov: 11896 ft: 15364 corp: 31/1560b lim: 85 exec/s: 60 rss: 69Mb L: 57/85 MS: 1 InsertByte- 00:08:27.006 [2024-11-02 12:08:13.859487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.006 [2024-11-02 12:08:13.859521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.006 [2024-11-02 12:08:13.859638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:27.006 [2024-11-02 12:08:13.859660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.006 #61 NEW cov: 11896 ft: 15366 corp: 32/1610b lim: 85 exec/s: 61 rss: 69Mb L: 50/85 MS: 1 CopyPart- 00:08:27.006 [2024-11-02 12:08:13.899648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.006 [2024-11-02 12:08:13.899679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.006 [2024-11-02 12:08:13.899801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:27.006 [2024-11-02 12:08:13.899823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.006 #62 NEW cov: 11896 ft: 15388 corp: 33/1660b lim: 85 exec/s: 62 rss: 69Mb L: 50/85 MS: 1 ChangeByte- 00:08:27.006 [2024-11-02 12:08:13.940187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.007 [2024-11-02 12:08:13.940217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.007 [2024-11-02 12:08:13.940303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:27.007 [2024-11-02 12:08:13.940324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:27.007 [2024-11-02 12:08:13.940437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:27.007 [2024-11-02 12:08:13.940460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.007 [2024-11-02 12:08:13.940578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:27.007 [2024-11-02 12:08:13.940599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.007 #63 NEW cov: 11896 ft: 15419 corp: 34/1734b lim: 85 exec/s: 63 rss: 69Mb L: 74/85 MS: 1 CrossOver- 00:08:27.265 [2024-11-02 12:08:13.990107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.265 [2024-11-02 12:08:13.990142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.265 [2024-11-02 12:08:13.990254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:27.265 [2024-11-02 12:08:13.990277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.265 [2024-11-02 12:08:13.990391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:27.265 [2024-11-02 12:08:13.990413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.265 #64 NEW cov: 11896 ft: 15431 corp: 35/1792b lim: 85 exec/s: 64 rss: 69Mb L: 58/85 MS: 1 PersAutoDict- DE: "_\031\260\210\3779\177\000"- 00:08:27.265 [2024-11-02 12:08:14.040557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.265 [2024-11-02 12:08:14.040589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.265 [2024-11-02 12:08:14.040666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:27.265 [2024-11-02 12:08:14.040688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.265 [2024-11-02 12:08:14.040807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:27.265 [2024-11-02 12:08:14.040827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.265 [2024-11-02 12:08:14.040942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:27.266 [2024-11-02 12:08:14.040965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.266 #65 NEW cov: 11896 ft: 15467 corp: 36/1875b lim: 85 exec/s: 65 rss: 69Mb L: 83/85 MS: 1 ShuffleBytes- 00:08:27.266 [2024-11-02 12:08:14.079894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.266 [2024-11-02 12:08:14.079928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.266 #66 NEW cov: 11896 ft: 15493 corp: 37/1907b lim: 85 exec/s: 66 rss: 69Mb L: 32/85 MS: 1 ChangeBinInt- 00:08:27.266 [2024-11-02 12:08:14.119972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.266 [2024-11-02 12:08:14.120007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.266 #67 NEW cov: 11896 ft: 15504 corp: 38/1930b lim: 85 exec/s: 67 rss: 69Mb L: 23/85 MS: 1 ChangeBit- 00:08:27.266 [2024-11-02 12:08:14.160819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.266 [2024-11-02 12:08:14.160849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.266 [2024-11-02 12:08:14.160932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:27.266 [2024-11-02 12:08:14.160954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.266 [2024-11-02 12:08:14.161083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:27.266 [2024-11-02 12:08:14.161103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.266 [2024-11-02 12:08:14.161221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:27.266 [2024-11-02 12:08:14.161245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.266 #73 NEW cov: 11896 ft: 15515 corp: 39/1998b lim: 85 exec/s: 73 rss: 69Mb L: 68/85 MS: 1 EraseBytes- 00:08:27.266 [2024-11-02 12:08:14.210907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.266 [2024-11-02 12:08:14.210937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.266 [2024-11-02 12:08:14.211038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:27.266 [2024-11-02 12:08:14.211062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.266 [2024-11-02 12:08:14.211181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:27.266 [2024-11-02 12:08:14.211202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.266 [2024-11-02 12:08:14.211324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:27.266 [2024-11-02 12:08:14.211347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.266 #74 NEW cov: 11896 ft: 15526 corp: 40/2081b lim: 85 exec/s: 37 rss: 69Mb L: 83/85 MS: 1 CopyPart- 00:08:27.266 #74 DONE cov: 11896 ft: 15526 corp: 40/2081b lim: 85 exec/s: 37 rss: 69Mb 00:08:27.266 ###### Recommended dictionary. 
######
00:08:27.266 "_\031\260\210\3779\177\000" # Uses: 4
00:08:27.266 ###### End of recommended dictionary. ######
00:08:27.266 Done 74 runs in 2 second(s)
00:08:27.526 12:08:14 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf
12:08:14 -- ../common.sh@72 -- # (( i++ ))
12:08:14 -- ../common.sh@72 -- # (( i < fuzz_num ))
12:08:14 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1
12:08:14 -- nvmf/run.sh@23 -- # local fuzzer_type=23
12:08:14 -- nvmf/run.sh@24 -- # local timen=1
12:08:14 -- nvmf/run.sh@25 -- # local core=0x1
12:08:14 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
12:08:14 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf
12:08:14 -- nvmf/run.sh@29 -- # printf %02d 23
00:08:27.785 12:08:14 -- nvmf/run.sh@29 -- # port=4423
12:08:14 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
12:08:14 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423'
12:08:14 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
12:08:14 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock
00:08:27.785 [2024-11-02 12:08:14.396360] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization...
00:08:27.785 [2024-11-02 12:08:14.396429] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1154183 ]
00:08:27.785 EAL: No free 2048 kB hugepages reported on node 1
00:08:27.785 [2024-11-02 12:08:14.571875] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:27.785 [2024-11-02 12:08:14.592393] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:27.785 [2024-11-02 12:08:14.592523] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:27.785 [2024-11-02 12:08:14.643844] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:27.785 [2024-11-02 12:08:14.660226] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 ***
00:08:27.785 INFO: Running with entropic power schedule (0xFF, 100).
00:08:27.785 INFO: Seed: 4139078694
00:08:27.785 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:27.785 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:27.785 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
00:08:27.785 INFO: A corpus is not provided, starting from an empty corpus
00:08:27.785 #2 INITED exec/s: 0 rss: 59Mb
00:08:27.785 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:27.785 This may also happen if the target rejected all inputs we tried so far 00:08:27.785 [2024-11-02 12:08:14.726083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:27.785 [2024-11-02 12:08:14.726127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.785 [2024-11-02 12:08:14.726256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:27.785 [2024-11-02 12:08:14.726276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.044 NEW_FUNC[1/671]: 0x47bcd8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:28.044 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:28.044 #5 NEW cov: 11602 ft: 11602 corp: 2/15b lim: 25 exec/s: 0 rss: 67Mb L: 14/14 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:28.303 [2024-11-02 12:08:15.037287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.303 [2024-11-02 12:08:15.037328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.303 [2024-11-02 12:08:15.037462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.303 [2024-11-02 12:08:15.037492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.303 [2024-11-02 12:08:15.037636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:28.303 [2024-11-02 12:08:15.037663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.303 #11 NEW cov: 11715 ft: 12343 corp: 3/30b lim: 25 exec/s: 0 rss: 67Mb L: 15/15 MS: 1 InsertByte- 00:08:28.303 [2024-11-02 12:08:15.097684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.303 [2024-11-02 12:08:15.097720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.303 [2024-11-02 12:08:15.097858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.303 [2024-11-02 12:08:15.097883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.303 [2024-11-02 12:08:15.098020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:28.303 [2024-11-02 12:08:15.098043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.303 [2024-11-02 12:08:15.098182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:28.303 [2024-11-02 12:08:15.098201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:28.303 #12 NEW cov: 11721 ft: 13018 corp: 4/54b lim: 25 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:28.303 [2024-11-02 12:08:15.147712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.303 [2024-11-02 12:08:15.147749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.303 [2024-11-02 12:08:15.147859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.303 [2024-11-02 12:08:15.147882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.303 [2024-11-02 12:08:15.148014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:28.303 [2024-11-02 12:08:15.148033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.303 [2024-11-02 12:08:15.148182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:28.303 [2024-11-02 12:08:15.148207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.303 #13 NEW cov: 11806 ft: 13494 corp: 5/78b lim: 25 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 ChangeBinInt- 00:08:28.303 [2024-11-02 12:08:15.207583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.303 [2024-11-02 12:08:15.207616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.303 [2024-11-02 12:08:15.207754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.303 [2024-11-02 12:08:15.207775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.303 #14 NEW cov: 11806 ft: 13545 corp: 6/89b lim: 25 exec/s: 0 rss: 67Mb L: 11/24 MS: 1 EraseBytes- 00:08:28.303 [2024-11-02 12:08:15.257626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.303 [2024-11-02 12:08:15.257660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.303 [2024-11-02 12:08:15.257808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.304 [2024-11-02 12:08:15.257834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.563 #15 NEW cov: 11806 ft: 13604 corp: 7/100b lim: 25 exec/s: 0 rss: 67Mb L: 11/24 MS: 1 CopyPart- 00:08:28.563 [2024-11-02 12:08:15.308557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.563 [2024-11-02 12:08:15.308588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.308673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.563 [2024-11-02 12:08:15.308695] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.308835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:28.563 [2024-11-02 12:08:15.308857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.308997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:28.563 [2024-11-02 12:08:15.309018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.309163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:28.563 [2024-11-02 12:08:15.309184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.563 #16 NEW cov: 11806 ft: 13725 corp: 8/125b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 CopyPart- 00:08:28.563 [2024-11-02 12:08:15.358598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.563 [2024-11-02 12:08:15.358635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.358747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.563 [2024-11-02 12:08:15.358768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.358907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:28.563 [2024-11-02 12:08:15.358932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.359030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:28.563 [2024-11-02 12:08:15.359056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.359204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:28.563 [2024-11-02 12:08:15.359231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.563 #17 NEW cov: 11806 ft: 13782 corp: 9/150b lim: 25 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 InsertByte- 00:08:28.563 [2024-11-02 12:08:15.418629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.563 [2024-11-02 12:08:15.418662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.418773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.563 [2024-11-02 12:08:15.418797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.563 
[2024-11-02 12:08:15.418934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:28.563 [2024-11-02 12:08:15.418957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.419098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:28.563 [2024-11-02 12:08:15.419122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.563 #18 NEW cov: 11806 ft: 13801 corp: 10/174b lim: 25 exec/s: 0 rss: 67Mb L: 24/25 MS: 1 CrossOver- 00:08:28.563 [2024-11-02 12:08:15.468335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.563 [2024-11-02 12:08:15.468372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.468527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.563 [2024-11-02 12:08:15.468557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.563 #20 NEW cov: 11806 ft: 13850 corp: 11/184b lim: 25 exec/s: 0 rss: 67Mb L: 10/25 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:28.563 [2024-11-02 12:08:15.518934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.563 [2024-11-02 12:08:15.518968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.519076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.563 [2024-11-02 12:08:15.519102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.519235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:28.563 [2024-11-02 12:08:15.519258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.563 [2024-11-02 12:08:15.519393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:28.563 [2024-11-02 12:08:15.519413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.822 #21 NEW cov: 11806 ft: 13926 corp: 12/208b lim: 25 exec/s: 0 rss: 67Mb L: 24/25 MS: 1 CrossOver- 00:08:28.822 [2024-11-02 12:08:15.568574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.822 [2024-11-02 12:08:15.568607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.822 #22 NEW cov: 11806 ft: 14372 corp: 13/215b lim: 25 exec/s: 0 rss: 67Mb L: 7/25 MS: 1 EraseBytes- 00:08:28.822 [2024-11-02 12:08:15.619366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.822 [2024-11-02 12:08:15.619402] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.822 [2024-11-02 12:08:15.619536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.822 [2024-11-02 12:08:15.619559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.823 [2024-11-02 12:08:15.619694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:28.823 [2024-11-02 12:08:15.619717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.823 [2024-11-02 12:08:15.619852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:28.823 [2024-11-02 12:08:15.619878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.823 #23 NEW cov: 11806 ft: 14379 corp: 14/235b lim: 25 exec/s: 0 rss: 67Mb L: 20/25 MS: 1 EraseBytes- 00:08:28.823 [2024-11-02 12:08:15.679701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.823 [2024-11-02 12:08:15.679737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.823 [2024-11-02 12:08:15.679849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.823 [2024-11-02 12:08:15.679871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.823 [2024-11-02 12:08:15.679999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:28.823 [2024-11-02 12:08:15.680022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.823 [2024-11-02 12:08:15.680174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:28.823 [2024-11-02 12:08:15.680199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.823 [2024-11-02 12:08:15.680332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:28.823 [2024-11-02 12:08:15.680354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.823 #29 NEW cov: 11806 ft: 14400 corp: 15/260b lim: 25 exec/s: 29 rss: 67Mb L: 25/25 MS: 1 CMP- DE: "\372\377\377\377"- 00:08:28.823 [2024-11-02 12:08:15.739074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.823 [2024-11-02 12:08:15.739108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.823 [2024-11-02 12:08:15.739238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.823 [2024-11-02 12:08:15.739261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:28.823 #30 NEW cov: 11806 ft: 14424 corp: 16/272b lim: 25 exec/s: 30 rss: 67Mb L: 12/25 MS: 1 InsertByte- 00:08:28.823 [2024-11-02 12:08:15.789757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:28.823 [2024-11-02 12:08:15.789789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.823 [2024-11-02 12:08:15.789880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:28.823 [2024-11-02 12:08:15.789903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.823 [2024-11-02 12:08:15.790041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:28.823 [2024-11-02 12:08:15.790063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.823 [2024-11-02 12:08:15.790196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:28.823 [2024-11-02 12:08:15.790219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.082 #31 NEW cov: 11806 ft: 14465 corp: 17/292b lim: 25 exec/s: 31 rss: 67Mb L: 20/25 MS: 1 ChangeByte- 00:08:29.082 [2024-11-02 12:08:15.850289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.082 [2024-11-02 12:08:15.850326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.082 [2024-11-02 12:08:15.850423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.082 [2024-11-02 12:08:15.850458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.082 [2024-11-02 12:08:15.850596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:29.082 [2024-11-02 12:08:15.850621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.082 [2024-11-02 12:08:15.850770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:29.082 [2024-11-02 12:08:15.850791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.082 [2024-11-02 12:08:15.850901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:29.082 [2024-11-02 12:08:15.850927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.082 #32 NEW cov: 11806 ft: 14479 corp: 18/317b lim: 25 exec/s: 32 rss: 67Mb L: 25/25 MS: 1 CopyPart- 00:08:29.082 [2024-11-02 12:08:15.909618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.082 [2024-11-02 12:08:15.909646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.082 
#35 NEW cov: 11806 ft: 14523 corp: 19/325b lim: 25 exec/s: 35 rss: 67Mb L: 8/25 MS: 3 ChangeByte-InsertByte-CrossOver- 00:08:29.082 [2024-11-02 12:08:15.959796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.082 [2024-11-02 12:08:15.959824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.082 #36 NEW cov: 11806 ft: 14555 corp: 20/332b lim: 25 exec/s: 36 rss: 67Mb L: 7/25 MS: 1 PersAutoDict- DE: "\372\377\377\377"- 00:08:29.082 [2024-11-02 12:08:16.020362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.082 [2024-11-02 12:08:16.020400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.082 [2024-11-02 12:08:16.020522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.082 [2024-11-02 12:08:16.020548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.082 [2024-11-02 12:08:16.020690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:29.082 [2024-11-02 12:08:16.020712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.082 #37 NEW cov: 11806 ft: 14561 corp: 21/350b lim: 25 exec/s: 37 rss: 67Mb L: 18/25 MS: 1 CrossOver- 00:08:29.341 [2024-11-02 12:08:16.070770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.341 [2024-11-02 12:08:16.070806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.070956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.341 [2024-11-02 12:08:16.070987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.071139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:29.341 [2024-11-02 12:08:16.071163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.071319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:29.341 [2024-11-02 12:08:16.071347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.341 #38 NEW cov: 11806 ft: 14672 corp: 22/371b lim: 25 exec/s: 38 rss: 67Mb L: 21/25 MS: 1 CopyPart- 00:08:29.341 [2024-11-02 12:08:16.131123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.341 [2024-11-02 12:08:16.131160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.131256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.341 [2024-11-02 12:08:16.131278] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.131417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:29.341 [2024-11-02 12:08:16.131437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.131571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:29.341 [2024-11-02 12:08:16.131594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.131729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:29.341 [2024-11-02 12:08:16.131753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.341 #39 NEW cov: 11806 ft: 14676 corp: 23/396b lim: 25 exec/s: 39 rss: 68Mb L: 25/25 MS: 1 InsertByte- 00:08:29.341 [2024-11-02 12:08:16.180754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.341 [2024-11-02 12:08:16.180789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.180901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.341 [2024-11-02 12:08:16.180922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.181066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:29.341 [2024-11-02 12:08:16.181088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.341 #40 NEW cov: 11806 ft: 14682 corp: 24/411b lim: 25 exec/s: 40 rss: 68Mb L: 15/25 MS: 1 ChangeBit- 00:08:29.341 [2024-11-02 12:08:16.240868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.341 [2024-11-02 12:08:16.240901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.241052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.341 [2024-11-02 12:08:16.241086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.341 #41 NEW cov: 11806 ft: 14690 corp: 25/422b lim: 25 exec/s: 41 rss: 68Mb L: 11/25 MS: 1 ShuffleBytes- 00:08:29.341 [2024-11-02 12:08:16.291879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.341 [2024-11-02 12:08:16.291915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.292025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.341 [2024-11-02 12:08:16.292045] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.292182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:29.341 [2024-11-02 12:08:16.292205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.292344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:29.341 [2024-11-02 12:08:16.292368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.341 [2024-11-02 12:08:16.292516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:29.341 [2024-11-02 12:08:16.292544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.601 #42 NEW cov: 11806 ft: 14711 corp: 26/447b lim: 25 exec/s: 42 rss: 68Mb L: 25/25 MS: 1 InsertByte- 00:08:29.601 [2024-11-02 12:08:16.351427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.601 [2024-11-02 12:08:16.351460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.601 [2024-11-02 12:08:16.351599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.601 [2024-11-02 12:08:16.351619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.601 [2024-11-02 12:08:16.351755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:29.601 [2024-11-02 12:08:16.351777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.601 #43 NEW cov: 11806 ft: 14722 corp: 27/462b lim: 25 exec/s: 43 rss: 68Mb L: 15/25 MS: 1 EraseBytes- 00:08:29.601 [2024-11-02 12:08:16.401131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.601 [2024-11-02 12:08:16.401159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.601 #44 NEW cov: 11806 ft: 14729 corp: 28/468b lim: 25 exec/s: 44 rss: 68Mb L: 6/25 MS: 1 EraseBytes- 00:08:29.601 [2024-11-02 12:08:16.451564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.601 [2024-11-02 12:08:16.451591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.601 [2024-11-02 12:08:16.451748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.601 [2024-11-02 12:08:16.451767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.601 #45 NEW cov: 11806 ft: 14763 corp: 29/480b lim: 25 exec/s: 45 rss: 68Mb L: 12/25 MS: 1 CrossOver- 00:08:29.601 [2024-11-02 12:08:16.502352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.601 [2024-11-02 12:08:16.502389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.601 [2024-11-02 12:08:16.502516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.601 [2024-11-02 12:08:16.502541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.601 [2024-11-02 12:08:16.502675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:29.601 [2024-11-02 12:08:16.502698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.601 [2024-11-02 12:08:16.502834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:29.601 [2024-11-02 12:08:16.502864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.601 [2024-11-02 12:08:16.503009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:29.601 [2024-11-02 12:08:16.503032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.601 #46 NEW cov: 11806 ft: 14774 corp: 30/505b lim: 25 exec/s: 46 rss: 68Mb L: 25/25 MS: 1 CopyPart- 00:08:29.601 [2024-11-02 12:08:16.562076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.601 [2024-11-02 12:08:16.562110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.601 [2024-11-02 12:08:16.562243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.601 [2024-11-02 12:08:16.562267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.601 [2024-11-02 12:08:16.562408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:29.601 [2024-11-02 12:08:16.562438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.860 #47 NEW cov: 11806 ft: 14802 corp: 31/520b lim: 25 exec/s: 47 rss: 68Mb L: 15/25 MS: 1 InsertRepeatedBytes- 00:08:29.860 [2024-11-02 12:08:16.612135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.860 [2024-11-02 12:08:16.612170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.860 [2024-11-02 12:08:16.612325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.860 [2024-11-02 12:08:16.612346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.861 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:29.861 #48 NEW cov: 11829 ft: 14843 corp: 32/531b lim: 25 exec/s: 48 rss: 68Mb L: 11/25 
MS: 1 CopyPart- 00:08:29.861 [2024-11-02 12:08:16.672880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.861 [2024-11-02 12:08:16.672915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.861 [2024-11-02 12:08:16.673023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.861 [2024-11-02 12:08:16.673048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.861 [2024-11-02 12:08:16.673184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:29.861 [2024-11-02 12:08:16.673206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.861 [2024-11-02 12:08:16.673348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:29.861 [2024-11-02 12:08:16.673368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.861 [2024-11-02 12:08:16.673509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:29.861 [2024-11-02 12:08:16.673535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.861 #49 NEW cov: 11829 ft: 14859 corp: 33/556b lim: 25 exec/s: 49 rss: 68Mb L: 25/25 MS: 1 CopyPart- 00:08:29.861 [2024-11-02 12:08:16.732401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:29.861 [2024-11-02 12:08:16.732437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.861 [2024-11-02 12:08:16.732581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:29.861 [2024-11-02 12:08:16.732605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.861 #50 NEW cov: 11829 ft: 14876 corp: 34/568b lim: 25 exec/s: 25 rss: 68Mb L: 12/25 MS: 1 InsertRepeatedBytes- 00:08:29.861 #50 DONE cov: 11829 ft: 14876 corp: 34/568b lim: 25 exec/s: 25 rss: 68Mb 00:08:29.861 ###### Recommended dictionary. ###### 00:08:29.861 "\372\377\377\377" # Uses: 1 00:08:29.861 ###### End of recommended dictionary. 
###### 00:08:29.861 Done 50 runs in 2 second(s) 00:08:30.120 12:08:16 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:30.120 12:08:16 -- ../common.sh@72 -- # (( i++ )) 00:08:30.120 12:08:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.120 12:08:16 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:30.120 12:08:16 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:30.120 12:08:16 -- nvmf/run.sh@24 -- # local timen=1 00:08:30.120 12:08:16 -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.120 12:08:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:30.120 12:08:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:30.120 12:08:16 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:30.120 12:08:16 -- nvmf/run.sh@29 -- # port=4424 00:08:30.120 12:08:16 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:30.120 12:08:16 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:30.120 12:08:16 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.120 12:08:16 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:30.120 [2024-11-02 12:08:16.915448] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:30.120 [2024-11-02 12:08:16.915517] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1154691 ] 00:08:30.120 EAL: No free 2048 kB hugepages reported on node 1 00:08:30.120 [2024-11-02 12:08:17.089127] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.380 [2024-11-02 12:08:17.109267] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:30.380 [2024-11-02 12:08:17.109403] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.380 [2024-11-02 12:08:17.160716] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:30.380 [2024-11-02 12:08:17.177051] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:30.380 INFO: Running with entropic power schedule (0xFF, 100). 00:08:30.380 INFO: Seed: 2363109010 00:08:30.380 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:30.380 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:30.380 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:30.380 INFO: A corpus is not provided, starting from an empty corpus 00:08:30.380 #2 INITED exec/s: 0 rss: 59Mb 00:08:30.380 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
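The ../common.sh@72-73 entries in the trace ((( i++ )), (( i < fuzz_num )), start_llvm_fuzz 24 1 0x1) imply that the harness simply iterates fuzzer indices in a counted loop, invoking the setup shown above once per index. A hedged reconstruction of that shape (the loop form and fuzz_num's value are assumptions read off the trace, not the verbatim script):

# Driver loop implied by the ../common.sh trace entries; illustrative only.
fuzz_num=25                          # assumed: fuzzer 24 is the highest index seen in this log
for (( i = 0; i < fuzz_num; i++ )); do
  # Arguments as seen in the trace: fuzzer index, timen value (passed to -t), core mask.
  start_llvm_fuzz "$i" 1 0x1
done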
00:08:30.380 This may also happen if the target rejected all inputs we tried so far 00:08:30.380 [2024-11-02 12:08:17.231811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.380 [2024-11-02 12:08:17.231844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.380 [2024-11-02 12:08:17.231898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.380 [2024-11-02 12:08:17.231916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.380 [2024-11-02 12:08:17.231945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.380 [2024-11-02 12:08:17.231961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.380 [2024-11-02 12:08:17.231989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.380 [2024-11-02 12:08:17.232012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.642 NEW_FUNC[1/672]: 0x47cdc8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:30.642 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:30.642 #17 NEW cov: 11674 ft: 11675 corp: 2/92b lim: 100 exec/s: 0 rss: 67Mb L: 91/91 MS: 5 CopyPart-ChangeBit-CMP-ChangeBinInt-InsertRepeatedBytes- DE: ")\000\000\000"- 00:08:30.642 [2024-11-02 12:08:17.552609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.642 [2024-11-02 12:08:17.552650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.642 [2024-11-02 12:08:17.552685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.642 [2024-11-02 12:08:17.552702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.642 [2024-11-02 12:08:17.552731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.642 [2024-11-02 12:08:17.552747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.642 [2024-11-02 12:08:17.552774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193307922857984 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.642 [2024-11-02 12:08:17.552790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 
cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.642 #18 NEW cov: 11787 ft: 12219 corp: 3/183b lim: 100 exec/s: 0 rss: 67Mb L: 91/91 MS: 1 PersAutoDict- DE: ")\000\000\000"- 00:08:30.904 [2024-11-02 12:08:17.622686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.622717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.622751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.622768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.622798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.622814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.622847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.622863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.904 #19 NEW cov: 11793 ft: 12423 corp: 4/274b lim: 100 exec/s: 0 rss: 67Mb L: 91/91 MS: 1 ChangeBit- 00:08:30.904 [2024-11-02 12:08:17.672747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.672778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.672826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.672844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.672873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.672889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.672917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.672933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.904 #20 NEW cov: 11878 ft: 12640 corp: 5/365b lim: 100 exec/s: 0 rss: 67Mb L: 91/91 MS: 1 ChangeBinInt- 00:08:30.904 [2024-11-02 12:08:17.742926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.742956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.743012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7595718147526453609 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.743030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.743060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.743076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.743104] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.743120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.904 #21 NEW cov: 11878 ft: 12719 corp: 6/462b lim: 100 exec/s: 0 rss: 67Mb L: 97/97 MS: 1 InsertRepeatedBytes- 00:08:30.904 [2024-11-02 12:08:17.793061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:31310 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.793091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.793138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.793160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.793190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.793206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.793234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.793249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.904 #22 NEW cov: 11878 ft: 12801 corp: 7/553b lim: 100 exec/s: 0 rss: 67Mb L: 91/97 MS: 1 ChangeByte- 00:08:30.904 [2024-11-02 12:08:17.853290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.853322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.853356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 
cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.853374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.853404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570153395400822093 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.853420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.904 [2024-11-02 12:08:17.853449] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:30.904 [2024-11-02 12:08:17.853465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.164 #23 NEW cov: 11878 ft: 12898 corp: 8/644b lim: 100 exec/s: 0 rss: 67Mb L: 91/97 MS: 1 CopyPart- 00:08:31.164 [2024-11-02 12:08:17.923376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:17.923406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.164 [2024-11-02 12:08:17.923439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:17.923456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.164 [2024-11-02 12:08:17.923486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:17.923502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.164 #24 NEW cov: 11878 ft: 13372 corp: 9/719b lim: 100 exec/s: 0 rss: 67Mb L: 75/97 MS: 1 EraseBytes- 00:08:31.164 [2024-11-02 12:08:17.973528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:17.973557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.164 [2024-11-02 12:08:17.973603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7595718147526453609 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:17.973624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.164 [2024-11-02 12:08:17.973654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:17.973670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.164 [2024-11-02 12:08:17.973698] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:17.973714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.164 #25 NEW cov: 11878 ft: 13390 corp: 10/816b lim: 100 exec/s: 0 rss: 67Mb L: 97/97 MS: 1 CMP- DE: "\000\001\000\000"- 00:08:31.164 [2024-11-02 12:08:18.033579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:18.033607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.164 [2024-11-02 12:08:18.033655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:18.033672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.164 #26 NEW cov: 11878 ft: 13768 corp: 11/864b lim: 100 exec/s: 0 rss: 67Mb L: 48/97 MS: 1 EraseBytes- 00:08:31.164 [2024-11-02 12:08:18.093877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:18.093905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.164 [2024-11-02 12:08:18.093953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7595718147526453609 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:18.093970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.164 [2024-11-02 12:08:18.094008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:18.094025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.164 [2024-11-02 12:08:18.094054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.164 [2024-11-02 12:08:18.094069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.164 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:31.164 #27 NEW cov: 11901 ft: 13817 corp: 12/963b lim: 100 exec/s: 0 rss: 68Mb L: 99/99 MS: 1 CrossOver- 00:08:31.423 [2024-11-02 12:08:18.143967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.423 [2024-11-02 12:08:18.144010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.423 [2024-11-02 12:08:18.144045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 
lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.423 [2024-11-02 12:08:18.144063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.423 [2024-11-02 12:08:18.144097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.423 [2024-11-02 12:08:18.144114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.423 #28 NEW cov: 11901 ft: 13856 corp: 13/1038b lim: 100 exec/s: 0 rss: 68Mb L: 75/99 MS: 1 ChangeBinInt- 00:08:31.423 [2024-11-02 12:08:18.214165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.423 [2024-11-02 12:08:18.214194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.423 [2024-11-02 12:08:18.214242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7595718147526453609 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.423 [2024-11-02 12:08:18.214259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.423 [2024-11-02 12:08:18.214287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531510605 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.424 [2024-11-02 12:08:18.214303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.424 [2024-11-02 12:08:18.214331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.424 [2024-11-02 12:08:18.214347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.424 #29 NEW cov: 11901 ft: 13875 corp: 14/1136b lim: 100 exec/s: 29 rss: 68Mb L: 98/99 MS: 1 InsertByte- 00:08:31.424 [2024-11-02 12:08:18.284385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.424 [2024-11-02 12:08:18.284414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.424 [2024-11-02 12:08:18.284461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7595718147526453609 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.424 [2024-11-02 12:08:18.284478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.424 [2024-11-02 12:08:18.284508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.424 [2024-11-02 12:08:18.284523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.424 [2024-11-02 12:08:18.284551] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.424 [2024-11-02 12:08:18.284567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.424 #30 NEW cov: 11901 ft: 13995 corp: 15/1235b lim: 100 exec/s: 30 rss: 68Mb L: 99/99 MS: 1 CrossOver- 00:08:31.424 [2024-11-02 12:08:18.344579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308529544525 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.424 [2024-11-02 12:08:18.344608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.424 [2024-11-02 12:08:18.344655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7595718147526446441 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.424 [2024-11-02 12:08:18.344672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.424 [2024-11-02 12:08:18.344706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.424 [2024-11-02 12:08:18.344722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.424 [2024-11-02 12:08:18.344750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.424 [2024-11-02 12:08:18.344765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.424 [2024-11-02 12:08:18.344792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.424 [2024-11-02 12:08:18.344808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:31.424 #31 NEW cov: 11901 ft: 14119 corp: 16/1335b lim: 100 exec/s: 31 rss: 68Mb L: 100/100 MS: 1 CopyPart- 00:08:31.683 [2024-11-02 12:08:18.414724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.683 [2024-11-02 12:08:18.414755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.683 [2024-11-02 12:08:18.414789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7595718147526453609 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.683 [2024-11-02 12:08:18.414808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.683 [2024-11-02 12:08:18.414839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7595718147526453609 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.683 [2024-11-02 12:08:18.414856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:31.683 [2024-11-02 12:08:18.414899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531510605 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.683 [2024-11-02 12:08:18.414916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.683 #32 NEW cov: 11901 ft: 14133 corp: 17/1433b lim: 100 exec/s: 32 rss: 68Mb L: 98/100 MS: 1 CopyPart- 00:08:31.683 [2024-11-02 12:08:18.484872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19912 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.683 [2024-11-02 12:08:18.484902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.683 [2024-11-02 12:08:18.484950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.683 [2024-11-02 12:08:18.484968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.683 [2024-11-02 12:08:18.485005] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193153913081165 len:78 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.683 [2024-11-02 12:08:18.485022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.683 [2024-11-02 12:08:18.485050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.683 [2024-11-02 12:08:18.485066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.683 #33 NEW cov: 11901 ft: 14202 corp: 18/1525b lim: 100 exec/s: 33 rss: 68Mb L: 92/100 MS: 1 InsertByte- 00:08:31.683 [2024-11-02 12:08:18.544921] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.683 [2024-11-02 12:08:18.544951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.683 [2024-11-02 12:08:18.545006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.683 [2024-11-02 12:08:18.545024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.683 #34 NEW cov: 11901 ft: 14213 corp: 19/1577b lim: 100 exec/s: 34 rss: 68Mb L: 52/100 MS: 1 EraseBytes- 00:08:31.684 [2024-11-02 12:08:18.605234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.684 [2024-11-02 12:08:18.605264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.684 [2024-11-02 12:08:18.605298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7595718147526453609 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.684 
[2024-11-02 12:08:18.605315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.684 [2024-11-02 12:08:18.605345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.684 [2024-11-02 12:08:18.605361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.684 [2024-11-02 12:08:18.605388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.684 [2024-11-02 12:08:18.605404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.684 #35 NEW cov: 11901 ft: 14278 corp: 20/1676b lim: 100 exec/s: 35 rss: 68Mb L: 99/100 MS: 1 CopyPart- 00:08:31.684 [2024-11-02 12:08:18.655390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19758 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.684 [2024-11-02 12:08:18.655421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.684 [2024-11-02 12:08:18.655454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7595718147526453609 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.684 [2024-11-02 12:08:18.655472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.684 [2024-11-02 12:08:18.655502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7595718147526453609 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.684 [2024-11-02 12:08:18.655519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.684 [2024-11-02 12:08:18.655547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531510605 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.684 [2024-11-02 12:08:18.655563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.943 #36 NEW cov: 11901 ft: 14303 corp: 21/1774b lim: 100 exec/s: 36 rss: 68Mb L: 98/100 MS: 1 ChangeByte- 00:08:31.943 [2024-11-02 12:08:18.725515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.725549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.943 [2024-11-02 12:08:18.725583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.725600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.943 [2024-11-02 12:08:18.725629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193407316151629 len:19790 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.725644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.943 #37 NEW cov: 11901 ft: 14370 corp: 22/1849b lim: 100 exec/s: 37 rss: 68Mb L: 75/100 MS: 1 ChangeByte- 00:08:31.943 [2024-11-02 12:08:18.775677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.775706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.943 [2024-11-02 12:08:18.775754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7595718147526453609 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.775771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.943 [2024-11-02 12:08:18.775800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.775816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.943 [2024-11-02 12:08:18.775844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.775860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.943 #38 NEW cov: 11901 ft: 14431 corp: 23/1948b lim: 100 exec/s: 38 rss: 68Mb L: 99/100 MS: 1 CopyPart- 00:08:31.943 [2024-11-02 12:08:18.825789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19912 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.825818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.943 [2024-11-02 12:08:18.825865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.825882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.943 [2024-11-02 12:08:18.825911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193153913081165 len:78 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.825927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.943 [2024-11-02 12:08:18.825955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.825970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.943 #39 NEW cov: 11901 ft: 14448 corp: 24/2040b lim: 100 exec/s: 39 rss: 68Mb L: 92/100 MS: 1 
ShuffleBytes- 00:08:31.943 [2024-11-02 12:08:18.885953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.885987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.943 [2024-11-02 12:08:18.886043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.886060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.943 [2024-11-02 12:08:18.886089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.886105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.943 [2024-11-02 12:08:18.886133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5576667232996248909 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.943 [2024-11-02 12:08:18.886149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.202 #40 NEW cov: 11901 ft: 14528 corp: 25/2133b lim: 100 exec/s: 40 rss: 69Mb L: 93/100 MS: 1 CopyPart- 00:08:32.202 [2024-11-02 12:08:18.956107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:18.956138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.203 [2024-11-02 12:08:18.956173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:18.956190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.203 [2024-11-02 12:08:18.956220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:7681 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:18.956236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.203 #41 NEW cov: 11901 ft: 14553 corp: 26/2208b lim: 100 exec/s: 41 rss: 69Mb L: 75/100 MS: 1 CMP- DE: "\036\000"- 00:08:32.203 [2024-11-02 12:08:19.006308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:19.006338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.203 [2024-11-02 12:08:19.006371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13310591800410945720 len:47289 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:19.006389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.203 [2024-11-02 12:08:19.006419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193310334078285 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:19.006435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.203 [2024-11-02 12:08:19.006462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:19.006494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.203 #42 NEW cov: 11901 ft: 14563 corp: 27/2304b lim: 100 exec/s: 42 rss: 69Mb L: 96/100 MS: 1 InsertRepeatedBytes- 00:08:32.203 [2024-11-02 12:08:19.056450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:19.056484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.203 [2024-11-02 12:08:19.056519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:19.056536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.203 [2024-11-02 12:08:19.056564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:19.056580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.203 [2024-11-02 12:08:19.056607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:19.056623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.203 #43 NEW cov: 11901 ft: 14600 corp: 28/2403b lim: 100 exec/s: 43 rss: 69Mb L: 99/100 MS: 1 CrossOver- 00:08:32.203 [2024-11-02 12:08:19.106461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:19.106491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.203 [2024-11-02 12:08:19.106524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.203 [2024-11-02 12:08:19.106542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.203 #44 NEW cov: 11901 ft: 14608 corp: 29/2456b lim: 100 exec/s: 44 rss: 69Mb L: 53/100 MS: 1 InsertByte- 00:08:32.203 [2024-11-02 12:08:19.176778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:5570193308529806669 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:32.203 [2024-11-02 12:08:19.176809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:32.203 [2024-11-02 12:08:19.176842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:32.203 [2024-11-02 12:08:19.176860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:32.203 [2024-11-02 12:08:19.176890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:32.203 [2024-11-02 12:08:19.176906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:32.203 [2024-11-02 12:08:19.176935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:5570193308531903821 len:19790 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:32.203 [2024-11-02 12:08:19.176951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:32.462 #45 NEW cov: 11901 ft: 14612 corp: 30/2547b lim: 100 exec/s: 22 rss: 69Mb L: 91/100 MS: 1 ChangeByte-
00:08:32.462 #45 DONE cov: 11901 ft: 14612 corp: 30/2547b lim: 100 exec/s: 22 rss: 69Mb
00:08:32.462 ###### Recommended dictionary. ######
00:08:32.462 ")\000\000\000" # Uses: 1
00:08:32.462 "\000\001\000\000" # Uses: 0
00:08:32.462 "\036\000" # Uses: 0
00:08:32.462 ###### End of recommended dictionary. ######
00:08:32.462 Done 45 runs in 2 second(s)
00:08:32.462 12:08:19 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf
00:08:32.462 12:08:19 -- ../common.sh@72 -- # (( i++ ))
00:08:32.462 12:08:19 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:32.462 12:08:19 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT
00:08:32.462
00:08:32.462 real 1m4.045s
00:08:32.462 user 1m39.172s
00:08:32.462 sys 0m8.560s
00:08:32.462 12:08:19 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:32.462 12:08:19 -- common/autotest_common.sh@10 -- # set +x
00:08:32.462 ************************************
00:08:32.462 END TEST nvmf_fuzz
00:08:32.462 ************************************
00:08:32.462 12:08:19 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}"
00:08:32.462 12:08:19 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in
00:08:32.462 12:08:19 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh
00:08:32.462 12:08:19 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:08:32.462 12:08:19 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:08:32.462 12:08:19 -- common/autotest_common.sh@10 -- # set +x
00:08:32.462 ************************************
00:08:32.462 START TEST vfio_fuzz
00:08:32.462 ************************************
00:08:32.462 12:08:19 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh
00:08:32.723 * Looking for test storage...
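The "#N NEW cov: ... ft: ... corp: ... exec/s: ... rss: ..." status lines in the run that just finished are standard libFuzzer output: cov counts the code edges covered so far, ft counts distinct coverage features, corp gives the corpus size in entries and bytes, and MS records how many mutations produced the input and which ones (CMP, EraseBytes, CrossOver, CopyPart, ChangeByte and so on, all visible above). Coverage growth can be pulled straight out of a saved copy of such a log; a minimal sketch, assuming the output was captured to a hypothetical nvmf_fuzz_24.log (the filename is illustrative, not from this job):

    # Emit "<iteration> <edges>" pairs from the libFuzzer "NEW cov:" lines.
    grep -o '#[0-9]* NEW cov: [0-9]*' nvmf_fuzz_24.log |
        awk '{ gsub(/#/, "", $1); print $1, $4 }'
    # -> "25 11878" ... "45 11901", ready for plotting coverage over time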
00:08:32.723 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:32.723 12:08:19 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:32.723 12:08:19 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:32.723 12:08:19 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:32.723 12:08:19 -- common/autotest_common.sh@34 -- # set -e 00:08:32.723 12:08:19 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:32.723 12:08:19 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:32.723 12:08:19 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:32.723 12:08:19 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:32.723 12:08:19 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:32.723 12:08:19 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:32.723 12:08:19 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:32.723 12:08:19 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:32.723 12:08:19 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:32.723 12:08:19 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:32.723 12:08:19 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:32.723 12:08:19 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:32.723 12:08:19 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:32.723 12:08:19 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:32.723 12:08:19 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:32.723 12:08:19 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:32.723 12:08:19 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:32.723 12:08:19 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:32.724 12:08:19 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:32.724 12:08:19 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:32.724 12:08:19 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:32.724 12:08:19 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:32.724 12:08:19 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:32.724 12:08:19 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:32.724 12:08:19 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:32.724 12:08:19 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:32.724 12:08:19 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:32.724 12:08:19 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:32.724 12:08:19 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:32.724 12:08:19 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:32.724 12:08:19 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:32.724 12:08:19 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:32.724 12:08:19 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:32.724 12:08:19 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:32.724 12:08:19 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:32.724 12:08:19 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:32.724 12:08:19 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:32.724 12:08:19 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:32.724 12:08:19 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:32.724 12:08:19 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:32.724 12:08:19 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:32.724 12:08:19 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:32.724 12:08:19 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:32.724 12:08:19 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:32.724 12:08:19 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:32.724 12:08:19 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:32.724 12:08:19 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:32.724 12:08:19 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:32.724 12:08:19 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:32.724 12:08:19 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:32.724 12:08:19 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:32.724 12:08:19 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:32.724 12:08:19 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:32.724 12:08:19 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:32.724 12:08:19 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:32.724 12:08:19 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:32.724 12:08:19 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:32.724 12:08:19 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:32.724 12:08:19 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:32.724 12:08:19 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:32.724 12:08:19 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:32.724 12:08:19 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:32.724 12:08:19 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:32.724 12:08:19 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:32.724 12:08:19 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:32.724 12:08:19 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:32.724 12:08:19 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:32.724 12:08:19 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:32.724 12:08:19 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:32.724 12:08:19 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:32.724 12:08:19 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:32.724 12:08:19 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:32.724 12:08:19 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:32.724 12:08:19 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:32.724 12:08:19 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:32.724 12:08:19 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:32.724 12:08:19 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:32.724 12:08:19 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:32.724 12:08:19 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:32.724 12:08:19 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:32.724 12:08:19 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:32.724 12:08:19 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 
00:08:32.724 12:08:19 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:32.724 12:08:19 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:32.724 12:08:19 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:32.724 12:08:19 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:32.724 12:08:19 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:32.724 12:08:19 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:32.724 12:08:19 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:32.724 12:08:19 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:32.724 12:08:19 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:32.724 12:08:19 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:32.724 12:08:19 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:32.724 12:08:19 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:32.724 12:08:19 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:32.724 12:08:19 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:32.724 12:08:19 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:32.724 12:08:19 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:32.724 12:08:19 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:32.724 #define SPDK_CONFIG_H 00:08:32.724 #define SPDK_CONFIG_APPS 1 00:08:32.724 #define SPDK_CONFIG_ARCH native 00:08:32.724 #undef SPDK_CONFIG_ASAN 00:08:32.724 #undef SPDK_CONFIG_AVAHI 00:08:32.724 #undef SPDK_CONFIG_CET 00:08:32.724 #define SPDK_CONFIG_COVERAGE 1 00:08:32.724 #define SPDK_CONFIG_CROSS_PREFIX 00:08:32.724 #undef SPDK_CONFIG_CRYPTO 00:08:32.724 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:32.724 #undef SPDK_CONFIG_CUSTOMOCF 00:08:32.724 #undef SPDK_CONFIG_DAOS 00:08:32.724 #define SPDK_CONFIG_DAOS_DIR 00:08:32.724 #define SPDK_CONFIG_DEBUG 1 00:08:32.724 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:32.724 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:32.724 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:32.724 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:32.724 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:32.724 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:32.724 #define SPDK_CONFIG_EXAMPLES 1 00:08:32.724 #undef SPDK_CONFIG_FC 00:08:32.724 #define SPDK_CONFIG_FC_PATH 00:08:32.724 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:32.724 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:32.724 #undef SPDK_CONFIG_FUSE 00:08:32.724 #define SPDK_CONFIG_FUZZER 1 00:08:32.724 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:32.724 #undef SPDK_CONFIG_GOLANG 00:08:32.724 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:32.724 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 
00:08:32.724 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:32.724 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:32.724 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:32.724 #define SPDK_CONFIG_IDXD 1 00:08:32.724 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:32.724 #undef SPDK_CONFIG_IPSEC_MB 00:08:32.724 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:32.724 #define SPDK_CONFIG_ISAL 1 00:08:32.724 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:32.724 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:32.724 #define SPDK_CONFIG_LIBDIR 00:08:32.724 #undef SPDK_CONFIG_LTO 00:08:32.724 #define SPDK_CONFIG_MAX_LCORES 00:08:32.724 #define SPDK_CONFIG_NVME_CUSE 1 00:08:32.724 #undef SPDK_CONFIG_OCF 00:08:32.724 #define SPDK_CONFIG_OCF_PATH 00:08:32.724 #define SPDK_CONFIG_OPENSSL_PATH 00:08:32.724 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:32.724 #undef SPDK_CONFIG_PGO_USE 00:08:32.724 #define SPDK_CONFIG_PREFIX /usr/local 00:08:32.724 #undef SPDK_CONFIG_RAID5F 00:08:32.724 #undef SPDK_CONFIG_RBD 00:08:32.724 #define SPDK_CONFIG_RDMA 1 00:08:32.724 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:32.724 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:32.724 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:32.724 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:32.724 #undef SPDK_CONFIG_SHARED 00:08:32.724 #undef SPDK_CONFIG_SMA 00:08:32.724 #define SPDK_CONFIG_TESTS 1 00:08:32.724 #undef SPDK_CONFIG_TSAN 00:08:32.724 #define SPDK_CONFIG_UBLK 1 00:08:32.724 #define SPDK_CONFIG_UBSAN 1 00:08:32.724 #undef SPDK_CONFIG_UNIT_TESTS 00:08:32.724 #undef SPDK_CONFIG_URING 00:08:32.724 #define SPDK_CONFIG_URING_PATH 00:08:32.724 #undef SPDK_CONFIG_URING_ZNS 00:08:32.724 #undef SPDK_CONFIG_USDT 00:08:32.724 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:32.724 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:32.724 #define SPDK_CONFIG_VFIO_USER 1 00:08:32.724 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:32.724 #define SPDK_CONFIG_VHOST 1 00:08:32.724 #define SPDK_CONFIG_VIRTIO 1 00:08:32.724 #undef SPDK_CONFIG_VTUNE 00:08:32.724 #define SPDK_CONFIG_VTUNE_DIR 00:08:32.724 #define SPDK_CONFIG_WERROR 1 00:08:32.724 #define SPDK_CONFIG_WPDK_DIR 00:08:32.724 #undef SPDK_CONFIG_XNVME 00:08:32.724 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:32.724 12:08:19 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:32.724 12:08:19 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:32.724 12:08:19 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:32.724 12:08:19 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:32.724 12:08:19 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:32.725 12:08:19 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.725 12:08:19 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.725 12:08:19 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.725 12:08:19 -- paths/export.sh@5 -- # export PATH 00:08:32.725 12:08:19 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.725 12:08:19 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:32.725 12:08:19 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:32.725 12:08:19 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:32.725 12:08:19 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:32.725 12:08:19 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:32.725 12:08:19 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:32.725 12:08:19 -- pm/common@16 -- # TEST_TAG=N/A 00:08:32.725 12:08:19 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:32.725 12:08:19 -- common/autotest_common.sh@52 -- # : 1 00:08:32.725 12:08:19 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:32.725 12:08:19 -- common/autotest_common.sh@56 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:32.725 12:08:19 -- common/autotest_common.sh@58 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:32.725 12:08:19 -- common/autotest_common.sh@60 -- # : 1 00:08:32.725 12:08:19 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:32.725 12:08:19 -- common/autotest_common.sh@62 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:32.725 12:08:19 -- common/autotest_common.sh@64 -- # : 00:08:32.725 12:08:19 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:32.725 12:08:19 -- common/autotest_common.sh@66 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:32.725 12:08:19 
-- common/autotest_common.sh@68 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:32.725 12:08:19 -- common/autotest_common.sh@70 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:32.725 12:08:19 -- common/autotest_common.sh@72 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:32.725 12:08:19 -- common/autotest_common.sh@74 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:32.725 12:08:19 -- common/autotest_common.sh@76 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:32.725 12:08:19 -- common/autotest_common.sh@78 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:32.725 12:08:19 -- common/autotest_common.sh@80 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:32.725 12:08:19 -- common/autotest_common.sh@82 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:32.725 12:08:19 -- common/autotest_common.sh@84 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:32.725 12:08:19 -- common/autotest_common.sh@86 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:32.725 12:08:19 -- common/autotest_common.sh@88 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:32.725 12:08:19 -- common/autotest_common.sh@90 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:32.725 12:08:19 -- common/autotest_common.sh@92 -- # : 1 00:08:32.725 12:08:19 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:32.725 12:08:19 -- common/autotest_common.sh@94 -- # : 1 00:08:32.725 12:08:19 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:32.725 12:08:19 -- common/autotest_common.sh@96 -- # : rdma 00:08:32.725 12:08:19 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:32.725 12:08:19 -- common/autotest_common.sh@98 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:32.725 12:08:19 -- common/autotest_common.sh@100 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:32.725 12:08:19 -- common/autotest_common.sh@102 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:32.725 12:08:19 -- common/autotest_common.sh@104 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:32.725 12:08:19 -- common/autotest_common.sh@106 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:32.725 12:08:19 -- common/autotest_common.sh@108 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:32.725 12:08:19 -- common/autotest_common.sh@110 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:32.725 12:08:19 -- common/autotest_common.sh@112 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:32.725 12:08:19 -- common/autotest_common.sh@114 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:32.725 
12:08:19 -- common/autotest_common.sh@116 -- # : 1 00:08:32.725 12:08:19 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:32.725 12:08:19 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:32.725 12:08:19 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:32.725 12:08:19 -- common/autotest_common.sh@120 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:32.725 12:08:19 -- common/autotest_common.sh@122 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:32.725 12:08:19 -- common/autotest_common.sh@124 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:32.725 12:08:19 -- common/autotest_common.sh@126 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:32.725 12:08:19 -- common/autotest_common.sh@128 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:32.725 12:08:19 -- common/autotest_common.sh@130 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:32.725 12:08:19 -- common/autotest_common.sh@132 -- # : v22.11.4 00:08:32.725 12:08:19 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:32.725 12:08:19 -- common/autotest_common.sh@134 -- # : true 00:08:32.725 12:08:19 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:32.725 12:08:19 -- common/autotest_common.sh@136 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:32.725 12:08:19 -- common/autotest_common.sh@138 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:32.725 12:08:19 -- common/autotest_common.sh@140 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:32.725 12:08:19 -- common/autotest_common.sh@142 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:32.725 12:08:19 -- common/autotest_common.sh@144 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:32.725 12:08:19 -- common/autotest_common.sh@146 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:32.725 12:08:19 -- common/autotest_common.sh@148 -- # : 00:08:32.725 12:08:19 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:32.725 12:08:19 -- common/autotest_common.sh@150 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:32.725 12:08:19 -- common/autotest_common.sh@152 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:32.725 12:08:19 -- common/autotest_common.sh@154 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:32.725 12:08:19 -- common/autotest_common.sh@156 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:32.725 12:08:19 -- common/autotest_common.sh@158 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:32.725 12:08:19 -- common/autotest_common.sh@160 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:32.725 12:08:19 -- common/autotest_common.sh@163 -- # : 00:08:32.725 12:08:19 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:32.725 12:08:19 -- common/autotest_common.sh@165 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:32.725 12:08:19 -- common/autotest_common.sh@167 -- # : 0 00:08:32.725 12:08:19 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:32.725 12:08:19 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:32.725 12:08:19 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:32.725 12:08:19 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:32.725 12:08:19 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:32.725 12:08:19 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:32.726 12:08:19 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:32.726 12:08:19 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:32.726 12:08:19 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:32.726 12:08:19 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:32.726 12:08:19 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:32.726 12:08:19 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:32.726 12:08:19 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:32.726 12:08:19 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:32.726 12:08:19 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:32.726 12:08:19 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:32.726 12:08:19 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:32.726 12:08:19 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:32.726 12:08:19 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:32.726 12:08:19 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:32.726 12:08:19 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:32.726 12:08:19 -- common/autotest_common.sh@196 -- # cat 00:08:32.726 12:08:19 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:32.726 12:08:19 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:32.726 12:08:19 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:32.726 12:08:19 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:32.726 12:08:19 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:32.726 12:08:19 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:32.726 12:08:19 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:32.726 12:08:19 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:32.726 12:08:19 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:32.726 12:08:19 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:32.726 12:08:19 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:32.726 12:08:19 -- 
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:32.726 12:08:19 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:32.726 12:08:19 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:32.726 12:08:19 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:32.726 12:08:19 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:32.726 12:08:19 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:32.726 12:08:19 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:32.726 12:08:19 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:32.726 12:08:19 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:08:32.726 12:08:19 -- common/autotest_common.sh@249 -- # export valgrind= 00:08:32.726 12:08:19 -- common/autotest_common.sh@249 -- # valgrind= 00:08:32.726 12:08:19 -- common/autotest_common.sh@255 -- # uname -s 00:08:32.726 12:08:19 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:08:32.726 12:08:19 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:08:32.726 12:08:19 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:08:32.726 12:08:19 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:08:32.726 12:08:19 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:32.726 12:08:19 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:32.726 12:08:19 -- common/autotest_common.sh@265 -- # MAKE=make 00:08:32.726 12:08:19 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:08:32.726 12:08:19 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:08:32.726 12:08:19 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:08:32.726 12:08:19 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:32.726 12:08:19 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:08:32.726 12:08:19 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:08:32.726 12:08:19 -- common/autotest_common.sh@309 -- # [[ -z 1155255 ]] 00:08:32.726 12:08:19 -- common/autotest_common.sh@309 -- # kill -0 1155255 00:08:32.726 12:08:19 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:08:32.726 12:08:19 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:08:32.726 12:08:19 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:08:32.726 12:08:19 -- common/autotest_common.sh@322 -- # local mount target_dir 00:08:32.726 12:08:19 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:08:32.726 12:08:19 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:08:32.726 12:08:19 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:08:32.726 12:08:19 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:08:32.726 12:08:19 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.Ov9uKY 00:08:32.726 12:08:19 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:32.726 12:08:19 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:08:32.726 12:08:19 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:08:32.726 12:08:19 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.Ov9uKY/tests/vfio /tmp/spdk.Ov9uKY 00:08:32.726 12:08:19 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:08:32.726 12:08:19 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:32.726 12:08:19 -- common/autotest_common.sh@318 -- # df -T 00:08:32.726 12:08:19 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:08:32.726 12:08:19 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:08:32.726 12:08:19 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:08:32.726 12:08:19 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:08:32.726 12:08:19 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:08:32.726 12:08:19 -- common/autotest_common.sh@353 -- # avails["$mount"]=4096 00:08:32.726 12:08:19 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:08:32.726 12:08:19 -- common/autotest_common.sh@354 -- # uses["$mount"]=5284425728 00:08:32.726 12:08:19 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:08:32.726 12:08:19 -- common/autotest_common.sh@353 -- # avails["$mount"]=52166242304 00:08:32.726 12:08:19 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61730603008 00:08:32.726 12:08:19 -- common/autotest_common.sh@354 -- # uses["$mount"]=9564360704 00:08:32.726 12:08:19 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:32.726 12:08:19 -- common/autotest_common.sh@353 -- # avails["$mount"]=30862708736 00:08:32.726 12:08:19 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30865301504 00:08:32.726 12:08:19 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:08:32.726 12:08:19 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:32.726 12:08:19 -- common/autotest_common.sh@353 -- # avails["$mount"]=12340129792 00:08:32.726 12:08:19 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12346122240 00:08:32.726 12:08:19 -- common/autotest_common.sh@354 -- # uses["$mount"]=5992448 00:08:32.726 12:08:19 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:32.726 12:08:19 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:32.726 12:08:19 -- common/autotest_common.sh@353 -- # avails["$mount"]=30864347136 00:08:32.727 12:08:19 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30865301504 00:08:32.727 12:08:19 -- common/autotest_common.sh@354 -- 
# uses["$mount"]=954368 00:08:32.727 12:08:19 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:32.727 12:08:19 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:32.727 12:08:19 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:32.727 12:08:19 -- common/autotest_common.sh@353 -- # avails["$mount"]=6173044736 00:08:32.727 12:08:19 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6173057024 00:08:32.727 12:08:19 -- common/autotest_common.sh@354 -- # uses["$mount"]=12288 00:08:32.727 12:08:19 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:32.727 12:08:19 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:08:32.727 * Looking for test storage... 00:08:32.727 12:08:19 -- common/autotest_common.sh@359 -- # local target_space new_size 00:08:32.727 12:08:19 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:08:32.727 12:08:19 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:32.727 12:08:19 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:32.727 12:08:19 -- common/autotest_common.sh@363 -- # mount=/ 00:08:32.727 12:08:19 -- common/autotest_common.sh@365 -- # target_space=52166242304 00:08:32.727 12:08:19 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:08:32.727 12:08:19 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:08:32.727 12:08:19 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:08:32.727 12:08:19 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:08:32.727 12:08:19 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:08:32.727 12:08:19 -- common/autotest_common.sh@372 -- # new_size=11778953216 00:08:32.727 12:08:19 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:32.727 12:08:19 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:32.727 12:08:19 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:32.727 12:08:19 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:32.727 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:32.727 12:08:19 -- common/autotest_common.sh@380 -- # return 0 00:08:32.727 12:08:19 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:08:32.727 12:08:19 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:08:32.727 12:08:19 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:32.727 12:08:19 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:32.727 12:08:19 -- common/autotest_common.sh@1672 -- # true 00:08:32.727 12:08:19 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:08:32.727 12:08:19 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:32.727 12:08:19 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:32.727 12:08:19 -- common/autotest_common.sh@27 -- # exec 00:08:32.727 12:08:19 -- common/autotest_common.sh@29 -- # exec 00:08:32.727 12:08:19 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:32.727 12:08:19 
-- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:32.727 12:08:19 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:32.727 12:08:19 -- common/autotest_common.sh@18 -- # set -x 00:08:32.727 12:08:19 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:32.727 12:08:19 -- ../common.sh@8 -- # pids=() 00:08:32.727 12:08:19 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:32.727 12:08:19 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:32.727 12:08:19 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:32.727 12:08:19 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:32.727 12:08:19 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:32.727 12:08:19 -- vfio/run.sh@65 -- # mem_size=0 00:08:32.727 12:08:19 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:32.727 12:08:19 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:32.727 12:08:19 -- ../common.sh@69 -- # local fuzz_num=7 00:08:32.727 12:08:19 -- ../common.sh@70 -- # local time=1 00:08:32.727 12:08:19 -- ../common.sh@72 -- # (( i = 0 )) 00:08:32.727 12:08:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.727 12:08:19 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:32.727 12:08:19 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:32.727 12:08:19 -- vfio/run.sh@23 -- # local timen=1 00:08:32.727 12:08:19 -- vfio/run.sh@24 -- # local core=0x1 00:08:32.727 12:08:19 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:32.727 12:08:19 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:32.727 12:08:19 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:32.727 12:08:19 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:32.727 12:08:19 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:32.727 12:08:19 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:32.727 12:08:19 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:32.727 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:32.727 12:08:19 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:32.727 [2024-11-02 12:08:19.602401] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 
00:08:32.727 [2024-11-02 12:08:19.602480] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1155295 ] 00:08:32.727 EAL: No free 2048 kB hugepages reported on node 1 00:08:32.727 [2024-11-02 12:08:19.672574] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.986 [2024-11-02 12:08:19.709366] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:32.986 [2024-11-02 12:08:19.709501] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.986 INFO: Running with entropic power schedule (0xFF, 100). 00:08:32.986 INFO: Seed: 765143566 00:08:32.986 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:32.986 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:32.986 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:32.986 INFO: A corpus is not provided, starting from an empty corpus 00:08:32.986 #2 INITED exec/s: 0 rss: 60Mb 00:08:32.986 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:32.986 This may also happen if the target rejected all inputs we tried so far 00:08:33.502 NEW_FUNC[1/631]: 0x450dd8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:33.502 NEW_FUNC[2/631]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:33.502 #4 NEW cov: 10761 ft: 10703 corp: 2/7b lim: 60 exec/s: 0 rss: 65Mb L: 6/6 MS: 2 CopyPart-CMP- DE: "\3537N<"- 00:08:33.761 #8 NEW cov: 10775 ft: 13719 corp: 3/18b lim: 60 exec/s: 0 rss: 66Mb L: 11/11 MS: 4 CopyPart-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:33.761 #10 NEW cov: 10775 ft: 14752 corp: 4/54b lim: 60 exec/s: 0 rss: 67Mb L: 36/36 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:34.019 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:34.019 #11 NEW cov: 10792 ft: 15442 corp: 5/60b lim: 60 exec/s: 0 rss: 67Mb L: 6/36 MS: 1 ChangeBinInt- 00:08:34.019 #12 NEW cov: 10792 ft: 15620 corp: 6/96b lim: 60 exec/s: 0 rss: 70Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:08:34.286 #13 NEW cov: 10792 ft: 16885 corp: 7/107b lim: 60 exec/s: 13 rss: 70Mb L: 11/36 MS: 1 PersAutoDict- DE: "\3537N<"- 00:08:34.286 #14 NEW cov: 10792 ft: 16996 corp: 8/143b lim: 60 exec/s: 14 rss: 70Mb L: 36/36 MS: 1 ChangeByte- 00:08:34.286 #19 NEW cov: 10792 ft: 17554 corp: 9/149b lim: 60 exec/s: 19 rss: 70Mb L: 6/36 MS: 5 CrossOver-CrossOver-CopyPart-ChangeByte-InsertByte- 00:08:34.544 #20 NEW cov: 10792 ft: 17799 corp: 10/167b lim: 60 exec/s: 20 rss: 70Mb L: 18/36 MS: 1 InsertRepeatedBytes- 00:08:34.544 #21 NEW cov: 10792 ft: 17933 corp: 11/173b lim: 60 exec/s: 21 rss: 70Mb L: 6/36 MS: 1 ChangeBit- 00:08:34.803 #27 NEW cov: 10792 ft: 18001 corp: 12/209b lim: 60 exec/s: 27 rss: 70Mb L: 36/36 MS: 1 CopyPart- 00:08:34.803 #28 NEW cov: 10799 ft: 18316 corp: 13/217b lim: 60 exec/s: 28 rss: 70Mb L: 8/36 MS: 1 CMP- DE: "\377\030"- 00:08:35.062 #34 NEW cov: 10799 ft: 18339 corp: 14/255b lim: 60 exec/s: 34 rss: 70Mb L: 38/38 MS: 1 CrossOver- 00:08:35.062 #35 NEW cov: 10799 ft: 18395 corp: 15/291b lim: 60 exec/s: 17 rss: 70Mb L: 36/38 MS: 1 PersAutoDict- DE: 
"\377\030"- 00:08:35.062 #35 DONE cov: 10799 ft: 18395 corp: 15/291b lim: 60 exec/s: 17 rss: 70Mb 00:08:35.062 ###### Recommended dictionary. ###### 00:08:35.062 "\3537N<" # Uses: 1 00:08:35.062 "\377\030" # Uses: 1 00:08:35.062 ###### End of recommended dictionary. ###### 00:08:35.062 Done 35 runs in 2 second(s) 00:08:35.321 12:08:22 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:35.321 12:08:22 -- ../common.sh@72 -- # (( i++ )) 00:08:35.321 12:08:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.321 12:08:22 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:35.321 12:08:22 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:35.321 12:08:22 -- vfio/run.sh@23 -- # local timen=1 00:08:35.321 12:08:22 -- vfio/run.sh@24 -- # local core=0x1 00:08:35.321 12:08:22 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:35.321 12:08:22 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:35.321 12:08:22 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:35.321 12:08:22 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:35.321 12:08:22 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:35.321 12:08:22 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:35.321 12:08:22 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:35.321 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:35.321 12:08:22 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:35.321 [2024-11-02 12:08:22.270863] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:35.321 [2024-11-02 12:08:22.270922] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1155729 ] 00:08:35.580 EAL: No free 2048 kB hugepages reported on node 1 00:08:35.580 [2024-11-02 12:08:22.338096] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.580 [2024-11-02 12:08:22.374255] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:35.580 [2024-11-02 12:08:22.374417] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.580 INFO: Running with entropic power schedule (0xFF, 100). 00:08:35.580 INFO: Seed: 3425137008 00:08:35.839 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:35.839 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:35.839 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:35.839 INFO: A corpus is not provided, starting from an empty corpus 00:08:35.839 #2 INITED exec/s: 0 rss: 60Mb 00:08:35.839 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:08:35.839 This may also happen if the target rejected all inputs we tried so far 00:08:35.839 [2024-11-02 12:08:22.684563] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:35.839 [2024-11-02 12:08:22.684597] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:35.839 [2024-11-02 12:08:22.684617] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:36.098 NEW_FUNC[1/638]: 0x451378 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:36.098 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:36.098 #5 NEW cov: 10777 ft: 10313 corp: 2/34b lim: 40 exec/s: 0 rss: 66Mb L: 33/33 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:36.356 [2024-11-02 12:08:23.146711] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:36.356 [2024-11-02 12:08:23.146741] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:36.356 [2024-11-02 12:08:23.146759] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:36.356 #6 NEW cov: 10796 ft: 13854 corp: 3/67b lim: 40 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 ChangeBinInt- 00:08:36.356 [2024-11-02 12:08:23.331639] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:36.356 [2024-11-02 12:08:23.331665] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:36.356 [2024-11-02 12:08:23.331683] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:36.615 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:36.615 #9 NEW cov: 10813 ft: 14561 corp: 4/72b lim: 40 exec/s: 0 rss: 68Mb L: 5/33 MS: 3 InsertByte-InsertByte-CopyPart- 00:08:36.615 [2024-11-02 12:08:23.535370] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:36.615 [2024-11-02 12:08:23.535393] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:36.615 [2024-11-02 12:08:23.535412] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:36.874 #13 NEW cov: 10813 ft: 14810 corp: 5/76b lim: 40 exec/s: 13 rss: 68Mb L: 4/33 MS: 4 ShuffleBytes-ChangeBinInt-InsertByte-CMP- DE: "\004\000"- 00:08:36.874 [2024-11-02 12:08:23.719128] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:36.874 [2024-11-02 12:08:23.719151] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:36.874 [2024-11-02 12:08:23.719168] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:36.874 #14 NEW cov: 10813 ft: 15818 corp: 6/110b lim: 40 exec/s: 14 rss: 68Mb L: 34/34 MS: 1 InsertByte- 00:08:37.133 [2024-11-02 12:08:23.896333] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:37.133 [2024-11-02 12:08:23.896356] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:37.133 [2024-11-02 12:08:23.896374] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:37.133 #15 NEW cov: 10813 ft: 16541 corp: 7/139b 
lim: 40 exec/s: 15 rss: 68Mb L: 29/34 MS: 1 InsertRepeatedBytes- 00:08:37.133 [2024-11-02 12:08:24.082199] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:37.133 [2024-11-02 12:08:24.082221] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:37.133 [2024-11-02 12:08:24.082240] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:37.391 #16 NEW cov: 10813 ft: 16967 corp: 8/145b lim: 40 exec/s: 16 rss: 68Mb L: 6/34 MS: 1 InsertByte- 00:08:37.391 [2024-11-02 12:08:24.265751] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:37.391 [2024-11-02 12:08:24.265773] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:37.391 [2024-11-02 12:08:24.265791] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:37.650 #17 NEW cov: 10813 ft: 17236 corp: 9/179b lim: 40 exec/s: 17 rss: 68Mb L: 34/34 MS: 1 CopyPart- 00:08:37.650 [2024-11-02 12:08:24.445211] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:37.650 [2024-11-02 12:08:24.445233] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:37.650 [2024-11-02 12:08:24.445251] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:37.650 #18 NEW cov: 10820 ft: 17382 corp: 10/208b lim: 40 exec/s: 18 rss: 68Mb L: 29/34 MS: 1 ShuffleBytes- 00:08:37.909 [2024-11-02 12:08:24.629278] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:37.909 [2024-11-02 12:08:24.629301] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:37.909 [2024-11-02 12:08:24.629319] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:37.909 #19 NEW cov: 10820 ft: 17429 corp: 11/236b lim: 40 exec/s: 9 rss: 68Mb L: 28/34 MS: 1 EraseBytes- 00:08:37.909 #19 DONE cov: 10820 ft: 17429 corp: 11/236b lim: 40 exec/s: 9 rss: 68Mb 00:08:37.909 ###### Recommended dictionary. ###### 00:08:37.909 "\004\000" # Uses: 0 00:08:37.909 ###### End of recommended dictionary. 
###### 00:08:37.909 Done 19 runs in 2 second(s) 00:08:38.168 12:08:24 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:38.168 12:08:24 -- ../common.sh@72 -- # (( i++ )) 00:08:38.168 12:08:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:38.168 12:08:24 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:38.168 12:08:24 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:38.168 12:08:24 -- vfio/run.sh@23 -- # local timen=1 00:08:38.168 12:08:24 -- vfio/run.sh@24 -- # local core=0x1 00:08:38.168 12:08:24 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:38.168 12:08:24 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:38.168 12:08:24 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:38.168 12:08:24 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:38.168 12:08:24 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:38.169 12:08:24 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:38.169 12:08:24 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:38.169 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:38.169 12:08:24 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:38.169 [2024-11-02 12:08:25.002609] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:38.169 [2024-11-02 12:08:25.002667] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1156138 ] 00:08:38.169 EAL: No free 2048 kB hugepages reported on node 1 00:08:38.169 [2024-11-02 12:08:25.070591] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.169 [2024-11-02 12:08:25.106971] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:38.169 [2024-11-02 12:08:25.107155] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.428 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.428 INFO: Seed: 1863173161 00:08:38.428 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:38.428 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:38.428 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:38.428 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.428 #2 INITED exec/s: 0 rss: 61Mb 00:08:38.428 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:38.428 This may also happen if the target rejected all inputs we tried so far 00:08:38.428 [2024-11-02 12:08:25.389077] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:38.945 NEW_FUNC[1/636]: 0x451d68 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:38.945 NEW_FUNC[2/636]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:38.945 #25 NEW cov: 10758 ft: 10663 corp: 2/67b lim: 80 exec/s: 0 rss: 68Mb L: 66/66 MS: 3 CopyPart-ChangeByte-InsertRepeatedBytes- 00:08:38.945 [2024-11-02 12:08:25.849834] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:39.204 #26 NEW cov: 10772 ft: 14328 corp: 3/111b lim: 80 exec/s: 0 rss: 69Mb L: 44/66 MS: 1 EraseBytes- 00:08:39.204 [2024-11-02 12:08:26.013169] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:39.204 #27 NEW cov: 10772 ft: 14685 corp: 4/177b lim: 80 exec/s: 0 rss: 70Mb L: 66/66 MS: 1 ChangeBit- 00:08:39.204 [2024-11-02 12:08:26.175197] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:39.462 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:39.462 #28 NEW cov: 10789 ft: 15422 corp: 5/244b lim: 80 exec/s: 0 rss: 70Mb L: 67/67 MS: 1 InsertByte- 00:08:39.462 [2024-11-02 12:08:26.338442] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:39.462 #29 NEW cov: 10789 ft: 15665 corp: 6/296b lim: 80 exec/s: 29 rss: 70Mb L: 52/67 MS: 1 EraseBytes- 00:08:39.721 [2024-11-02 12:08:26.501587] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:39.721 #33 NEW cov: 10789 ft: 16180 corp: 7/315b lim: 80 exec/s: 33 rss: 70Mb L: 19/67 MS: 4 CopyPart-ChangeBit-CopyPart-CrossOver- 00:08:39.721 [2024-11-02 12:08:26.674644] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:39.980 #34 NEW cov: 10789 ft: 16397 corp: 8/368b lim: 80 exec/s: 34 rss: 70Mb L: 53/67 MS: 1 InsertByte- 00:08:39.980 [2024-11-02 12:08:26.837333] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:39.980 #35 NEW cov: 10789 ft: 16893 corp: 9/388b lim: 80 exec/s: 35 rss: 70Mb L: 20/67 MS: 1 CopyPart- 00:08:40.238 [2024-11-02 12:08:26.999302] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:40.238 #36 NEW cov: 10789 ft: 17167 corp: 10/441b lim: 80 exec/s: 36 rss: 70Mb L: 53/67 MS: 1 CMP- DE: "\000\000\000\000\000\000\002\000"- 00:08:40.238 [2024-11-02 12:08:27.161208] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:40.496 #37 NEW cov: 10796 ft: 17262 corp: 11/485b lim: 80 exec/s: 37 rss: 70Mb L: 44/67 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\002\000"- 00:08:40.496 [2024-11-02 12:08:27.322712] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:40.496 #43 NEW cov: 10796 ft: 17309 corp: 12/552b lim: 80 exec/s: 21 rss: 70Mb L: 67/67 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\002\000"- 00:08:40.496 #43 DONE cov: 10796 ft: 17309 corp: 12/552b lim: 80 exec/s: 21 rss: 70Mb 00:08:40.496 ###### Recommended dictionary. 
###### 00:08:40.496 "\000\000\000\000\000\000\002\000" # Uses: 2 00:08:40.496 ###### End of recommended dictionary. ###### 00:08:40.496 Done 43 runs in 2 second(s) 00:08:40.755 12:08:27 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:40.756 12:08:27 -- ../common.sh@72 -- # (( i++ )) 00:08:40.756 12:08:27 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.756 12:08:27 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:40.756 12:08:27 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:40.756 12:08:27 -- vfio/run.sh@23 -- # local timen=1 00:08:40.756 12:08:27 -- vfio/run.sh@24 -- # local core=0x1 00:08:40.756 12:08:27 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:40.756 12:08:27 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:40.756 12:08:27 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:40.756 12:08:27 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:40.756 12:08:27 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:40.756 12:08:27 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:40.756 12:08:27 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:40.756 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:40.756 12:08:27 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:40.756 [2024-11-02 12:08:27.708737] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:40.756 [2024-11-02 12:08:27.708828] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1156680 ] 00:08:41.015 EAL: No free 2048 kB hugepages reported on node 1 00:08:41.015 [2024-11-02 12:08:27.778550] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.015 [2024-11-02 12:08:27.814604] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:41.015 [2024-11-02 12:08:27.814754] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.015 INFO: Running with entropic power schedule (0xFF, 100). 00:08:41.015 INFO: Seed: 270188861 00:08:41.273 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:41.273 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:41.273 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:41.273 INFO: A corpus is not provided, starting from an empty corpus 00:08:41.273 #2 INITED exec/s: 0 rss: 59Mb 00:08:41.273 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:41.273 This may also happen if the target rejected all inputs we tried so far 00:08:41.532 NEW_FUNC[1/632]: 0x452458 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:41.532 NEW_FUNC[2/632]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:41.532 #16 NEW cov: 10745 ft: 10652 corp: 2/39b lim: 320 exec/s: 0 rss: 65Mb L: 38/38 MS: 4 ChangeBit-ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:41.790 #22 NEW cov: 10762 ft: 14322 corp: 3/111b lim: 320 exec/s: 0 rss: 67Mb L: 72/72 MS: 1 CrossOver- 00:08:42.048 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:42.048 #24 NEW cov: 10779 ft: 15037 corp: 4/187b lim: 320 exec/s: 0 rss: 68Mb L: 76/76 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:42.048 #25 NEW cov: 10779 ft: 15420 corp: 5/225b lim: 320 exec/s: 25 rss: 68Mb L: 38/76 MS: 1 ShuffleBytes- 00:08:42.307 #26 NEW cov: 10779 ft: 15697 corp: 6/263b lim: 320 exec/s: 26 rss: 68Mb L: 38/76 MS: 1 ShuffleBytes- 00:08:42.565 #29 NEW cov: 10779 ft: 15758 corp: 7/392b lim: 320 exec/s: 29 rss: 68Mb L: 129/129 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:42.822 #30 NEW cov: 10779 ft: 15951 corp: 8/441b lim: 320 exec/s: 30 rss: 68Mb L: 49/129 MS: 1 CopyPart- 00:08:42.822 #31 NEW cov: 10779 ft: 16585 corp: 9/513b lim: 320 exec/s: 31 rss: 68Mb L: 72/129 MS: 1 ShuffleBytes- 00:08:43.080 #32 NEW cov: 10786 ft: 16761 corp: 10/637b lim: 320 exec/s: 32 rss: 68Mb L: 124/129 MS: 1 CrossOver- 00:08:43.339 #33 NEW cov: 10786 ft: 16783 corp: 11/767b lim: 320 exec/s: 16 rss: 68Mb L: 130/130 MS: 1 InsertByte- 00:08:43.339 #33 DONE cov: 10786 ft: 16783 corp: 11/767b lim: 320 exec/s: 16 rss: 68Mb 00:08:43.339 Done 33 runs in 2 second(s) 00:08:43.597 12:08:30 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:43.597 12:08:30 -- ../common.sh@72 -- # (( i++ )) 00:08:43.597 12:08:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:43.597 12:08:30 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:43.597 12:08:30 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:43.597 12:08:30 -- vfio/run.sh@23 -- # local timen=1 00:08:43.597 12:08:30 -- vfio/run.sh@24 -- # local core=0x1 00:08:43.597 12:08:30 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:43.597 12:08:30 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:43.597 12:08:30 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:43.597 12:08:30 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:43.597 12:08:30 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:43.597 12:08:30 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:43.597 12:08:30 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:43.597 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:43.597 12:08:30 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c 
/tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:43.597 [2024-11-02 12:08:30.395175] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:43.597 [2024-11-02 12:08:30.395248] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1157223 ] 00:08:43.597 EAL: No free 2048 kB hugepages reported on node 1 00:08:43.597 [2024-11-02 12:08:30.466047] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.597 [2024-11-02 12:08:30.502771] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:43.597 [2024-11-02 12:08:30.502914] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.856 INFO: Running with entropic power schedule (0xFF, 100). 00:08:43.856 INFO: Seed: 2962196576 00:08:43.856 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:43.856 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:43.856 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:43.856 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.856 #2 INITED exec/s: 0 rss: 60Mb 00:08:43.856 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:43.856 This may also happen if the target rejected all inputs we tried so far 00:08:43.856 [2024-11-02 12:08:30.786039] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:08:43.856 [2024-11-02 12:08:30.786077] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:43.856 [2024-11-02 12:08:30.786088] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:43.856 [2024-11-02 12:08:30.786120] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:43.856 [2024-11-02 12:08:30.787018] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:43.856 [2024-11-02 12:08:30.787031] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:43.856 [2024-11-02 12:08:30.787047] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:44.372 NEW_FUNC[1/637]: 0x452cd8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:44.372 NEW_FUNC[2/637]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:44.372 #12 NEW cov: 10771 ft: 10748 corp: 2/99b lim: 320 exec/s: 0 rss: 66Mb L: 98/98 MS: 5 ShuffleBytes-InsertByte-InsertByte-CopyPart-InsertRepeatedBytes- 00:08:44.372 [2024-11-02 12:08:31.242098] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:44.372 [2024-11-02 12:08:31.242135] vfio_user.c:3096:vfio_user_log: *ERROR*: 
/tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:44.372 [2024-11-02 12:08:31.242146] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:44.372 [2024-11-02 12:08:31.242162] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:44.372 [2024-11-02 12:08:31.243103] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:44.372 [2024-11-02 12:08:31.243124] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:44.372 [2024-11-02 12:08:31.243140] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:44.631 NEW_FUNC[1/1]: 0x1374b48 in spdk_nvme_opc_get_data_transfer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/nvme_spec.h:1728 00:08:44.631 #13 NEW cov: 10794 ft: 14251 corp: 3/150b lim: 320 exec/s: 0 rss: 67Mb L: 51/98 MS: 1 EraseBytes- 00:08:44.631 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:44.631 #15 NEW cov: 10815 ft: 15140 corp: 4/195b lim: 320 exec/s: 0 rss: 68Mb L: 45/98 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:44.888 [2024-11-02 12:08:31.634891] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:44.888 [2024-11-02 12:08:31.634917] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:44.888 [2024-11-02 12:08:31.634928] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:44.888 [2024-11-02 12:08:31.634959] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:44.888 [2024-11-02 12:08:31.635911] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:44.888 [2024-11-02 12:08:31.635931] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:44.888 [2024-11-02 12:08:31.635947] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:44.888 #16 NEW cov: 10815 ft: 15482 corp: 5/246b lim: 320 exec/s: 16 rss: 68Mb L: 51/98 MS: 1 ChangeBit- 00:08:44.888 [2024-11-02 12:08:31.821885] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:44.888 [2024-11-02 12:08:31.821908] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:44.888 [2024-11-02 12:08:31.821918] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:44.888 [2024-11-02 12:08:31.821934] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:44.888 [2024-11-02 12:08:31.822895] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:44.888 [2024-11-02 12:08:31.822915] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:44.888 [2024-11-02 12:08:31.822930] vfio_user.c: 
144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:45.147 #22 NEW cov: 10815 ft: 15778 corp: 6/296b lim: 320 exec/s: 22 rss: 68Mb L: 50/98 MS: 1 EraseBytes- 00:08:45.147 #28 NEW cov: 10815 ft: 16568 corp: 7/347b lim: 320 exec/s: 28 rss: 68Mb L: 51/98 MS: 1 InsertByte- 00:08:45.405 [2024-11-02 12:08:32.190246] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0xa00000000000000 prot=0x3: Invalid argument 00:08:45.405 [2024-11-02 12:08:32.190270] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0xa00000000000000 flags=0x3: Invalid argument 00:08:45.405 [2024-11-02 12:08:32.190281] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:45.405 [2024-11-02 12:08:32.190297] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.405 [2024-11-02 12:08:32.191248] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:45.405 [2024-11-02 12:08:32.191267] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:45.405 [2024-11-02 12:08:32.191282] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:45.405 #31 NEW cov: 10815 ft: 16899 corp: 8/383b lim: 320 exec/s: 31 rss: 68Mb L: 36/98 MS: 3 EraseBytes-ShuffleBytes-CopyPart- 00:08:45.663 #32 NEW cov: 10815 ft: 17029 corp: 9/428b lim: 320 exec/s: 32 rss: 69Mb L: 45/98 MS: 1 ChangeBinInt- 00:08:45.921 #33 NEW cov: 10822 ft: 17374 corp: 10/473b lim: 320 exec/s: 33 rss: 69Mb L: 45/98 MS: 1 ShuffleBytes- 00:08:45.921 #34 NEW cov: 10822 ft: 17667 corp: 11/524b lim: 320 exec/s: 17 rss: 69Mb L: 51/98 MS: 1 CrossOver- 00:08:45.921 #34 DONE cov: 10822 ft: 17667 corp: 11/524b lim: 320 exec/s: 17 rss: 69Mb 00:08:45.921 Done 34 runs in 2 second(s) 00:08:46.180 12:08:33 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:46.180 12:08:33 -- ../common.sh@72 -- # (( i++ )) 00:08:46.180 12:08:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:46.180 12:08:33 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:46.180 12:08:33 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:46.180 12:08:33 -- vfio/run.sh@23 -- # local timen=1 00:08:46.180 12:08:33 -- vfio/run.sh@24 -- # local core=0x1 00:08:46.180 12:08:33 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:46.180 12:08:33 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:46.180 12:08:33 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:46.180 12:08:33 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:46.180 12:08:33 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:46.180 12:08:33 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:46.180 12:08:33 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:46.180 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:46.180 12:08:33 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 
-P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:46.180 [2024-11-02 12:08:33.140124] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:46.180 [2024-11-02 12:08:33.140196] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1157658 ] 00:08:46.438 EAL: No free 2048 kB hugepages reported on node 1 00:08:46.438 [2024-11-02 12:08:33.209619] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.438 [2024-11-02 12:08:33.245828] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:46.438 [2024-11-02 12:08:33.245984] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.696 INFO: Running with entropic power schedule (0xFF, 100). 00:08:46.696 INFO: Seed: 1409226208 00:08:46.696 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:46.696 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:46.696 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:46.696 INFO: A corpus is not provided, starting from an empty corpus 00:08:46.696 #2 INITED exec/s: 0 rss: 61Mb 00:08:46.696 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:46.696 This may also happen if the target rejected all inputs we tried so far 00:08:46.696 [2024-11-02 12:08:33.551801] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:46.696 [2024-11-02 12:08:33.551849] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.212 NEW_FUNC[1/638]: 0x4536d8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:47.212 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:47.212 #6 NEW cov: 10779 ft: 10666 corp: 2/74b lim: 120 exec/s: 0 rss: 68Mb L: 73/73 MS: 4 ShuffleBytes-CMP-EraseBytes-InsertRepeatedBytes- DE: "\010\001\000\000\000\000\000\000"- 00:08:47.212 [2024-11-02 12:08:34.036049] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:47.212 [2024-11-02 12:08:34.036091] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.212 #10 NEW cov: 10793 ft: 13126 corp: 3/120b lim: 120 exec/s: 0 rss: 70Mb L: 46/73 MS: 4 ChangeBit-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:47.470 [2024-11-02 12:08:34.231962] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:47.470 [2024-11-02 12:08:34.231999] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.470 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:47.470 #11 NEW cov: 10810 ft: 14115 corp: 4/164b lim: 120 exec/s: 0 rss: 70Mb L: 44/73 MS: 1 InsertRepeatedBytes- 00:08:47.470 [2024-11-02 12:08:34.411038] 
vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:47.470 [2024-11-02 12:08:34.411080] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.728 #12 NEW cov: 10810 ft: 15906 corp: 5/208b lim: 120 exec/s: 12 rss: 70Mb L: 44/73 MS: 1 ChangeBit- 00:08:47.728 [2024-11-02 12:08:34.590578] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:47.728 [2024-11-02 12:08:34.590608] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.728 #13 NEW cov: 10810 ft: 16183 corp: 6/254b lim: 120 exec/s: 13 rss: 70Mb L: 46/73 MS: 1 ChangeBit- 00:08:47.986 [2024-11-02 12:08:34.770355] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:47.986 [2024-11-02 12:08:34.770387] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.986 #14 NEW cov: 10810 ft: 16920 corp: 7/312b lim: 120 exec/s: 14 rss: 70Mb L: 58/73 MS: 1 EraseBytes- 00:08:47.986 [2024-11-02 12:08:34.950783] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:47.986 [2024-11-02 12:08:34.950813] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:48.244 #15 NEW cov: 10810 ft: 17286 corp: 8/371b lim: 120 exec/s: 15 rss: 70Mb L: 59/73 MS: 1 InsertByte- 00:08:48.244 [2024-11-02 12:08:35.130090] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:48.244 [2024-11-02 12:08:35.130120] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:48.502 #16 NEW cov: 10810 ft: 17624 corp: 9/417b lim: 120 exec/s: 16 rss: 70Mb L: 46/73 MS: 1 CMP- DE: "=\375\031\207\014:\177\000"- 00:08:48.502 [2024-11-02 12:08:35.310324] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:48.502 [2024-11-02 12:08:35.310353] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:48.502 #17 NEW cov: 10817 ft: 17733 corp: 10/507b lim: 120 exec/s: 17 rss: 70Mb L: 90/90 MS: 1 CrossOver- 00:08:48.760 [2024-11-02 12:08:35.486840] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:48.760 [2024-11-02 12:08:35.486869] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:48.760 #18 NEW cov: 10817 ft: 17915 corp: 11/566b lim: 120 exec/s: 9 rss: 70Mb L: 59/90 MS: 1 PersAutoDict- DE: "=\375\031\207\014:\177\000"- 00:08:48.760 #18 DONE cov: 10817 ft: 17915 corp: 11/566b lim: 120 exec/s: 9 rss: 70Mb 00:08:48.760 ###### Recommended dictionary. ###### 00:08:48.760 "\010\001\000\000\000\000\000\000" # Uses: 0 00:08:48.760 "=\375\031\207\014:\177\000" # Uses: 1 00:08:48.760 ###### End of recommended dictionary. 
###### 00:08:48.760 Done 18 runs in 2 second(s) 00:08:49.019 12:08:35 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:49.019 12:08:35 -- ../common.sh@72 -- # (( i++ )) 00:08:49.019 12:08:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.019 12:08:35 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:49.019 12:08:35 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:49.019 12:08:35 -- vfio/run.sh@23 -- # local timen=1 00:08:49.019 12:08:35 -- vfio/run.sh@24 -- # local core=0x1 00:08:49.019 12:08:35 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:49.019 12:08:35 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:49.019 12:08:35 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:49.019 12:08:35 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:49.019 12:08:35 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:49.019 12:08:35 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:49.019 12:08:35 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:49.019 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:49.019 12:08:35 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:49.019 [2024-11-02 12:08:35.872683] Starting SPDK v24.01.1-pre git sha1 726a04d70 / DPDK 22.11.4 initialization... 00:08:49.019 [2024-11-02 12:08:35.872751] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1158068 ] 00:08:49.019 EAL: No free 2048 kB hugepages reported on node 1 00:08:49.019 [2024-11-02 12:08:35.939260] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.019 [2024-11-02 12:08:35.975342] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:49.019 [2024-11-02 12:08:35.975483] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.277 INFO: Running with entropic power schedule (0xFF, 100). 00:08:49.277 INFO: Seed: 4143232650 00:08:49.277 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:49.277 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:49.277 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:49.277 INFO: A corpus is not provided, starting from an empty corpus 00:08:49.277 #2 INITED exec/s: 0 rss: 60Mb 00:08:49.277 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:49.535 [2024-11-02 12:08:36.291718] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:49.535 [2024-11-02 12:08:36.291763] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:49.794 NEW_FUNC[1/638]: 0x4543c8 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190
00:08:49.794 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:49.794 #6 NEW cov: 10772 ft: 10395 corp: 2/47b lim: 90 exec/s: 0 rss: 66Mb L: 46/46 MS: 4 CopyPart-ChangeByte-InsertByte-InsertRepeatedBytes-
00:08:50.052 [2024-11-02 12:08:36.775423] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:50.052 [2024-11-02 12:08:36.775465] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:50.052 #9 NEW cov: 10786 ft: 14075 corp: 3/57b lim: 90 exec/s: 0 rss: 67Mb L: 10/46 MS: 3 CrossOver-InsertByte-CMP- DE: "\234\264Px\015:\177\000"-
00:08:50.052 [2024-11-02 12:08:36.964927] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:50.052 [2024-11-02 12:08:36.964959] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:50.310 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:50.310 #10 NEW cov: 10803 ft: 15369 corp: 4/103b lim: 90 exec/s: 0 rss: 68Mb L: 46/46 MS: 1 ChangeByte-
00:08:50.310 [2024-11-02 12:08:37.159721] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:50.310 [2024-11-02 12:08:37.159753] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:50.310 #11 NEW cov: 10803 ft: 16087 corp: 5/149b lim: 90 exec/s: 11 rss: 68Mb L: 46/46 MS: 1 ChangeBit-
00:08:50.568 [2024-11-02 12:08:37.339427] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:50.568 [2024-11-02 12:08:37.339457] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:50.568 #12 NEW cov: 10803 ft: 16667 corp: 6/203b lim: 90 exec/s: 12 rss: 70Mb L: 54/54 MS: 1 PersAutoDict- DE: "\234\264Px\015:\177\000"-
00:08:50.568 [2024-11-02 12:08:37.521196] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:50.568 [2024-11-02 12:08:37.521231] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:50.826 #13 NEW cov: 10803 ft: 16734 corp: 7/257b lim: 90 exec/s: 13 rss: 70Mb L: 54/54 MS: 1 ChangeBinInt-
00:08:50.826 [2024-11-02 12:08:37.705050] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:50.826 [2024-11-02 12:08:37.705080] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:51.097 #14 NEW cov: 10803 ft: 17070 corp: 8/303b lim: 90 exec/s: 14 rss: 70Mb L: 46/54 MS: 1 ChangeByte-
00:08:51.097 [2024-11-02 12:08:37.887638] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:51.097 [2024-11-02 12:08:37.887670] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:51.097 #15 NEW cov: 10810 ft: 17295 corp: 9/349b lim: 90 exec/s: 15 rss: 70Mb L: 46/54 MS: 1 ChangeBit-
00:08:51.383 [2024-11-02 12:08:38.073451] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:51.383 [2024-11-02 12:08:38.073483] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:51.383 #16 NEW cov: 10810 ft: 17483 corp: 10/403b lim: 90 exec/s: 16 rss: 70Mb L: 54/54 MS: 1 CopyPart-
00:08:51.383 [2024-11-02 12:08:38.253143] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:51.383 [2024-11-02 12:08:38.253174] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:51.654 #17 NEW cov: 10810 ft: 17811 corp: 11/446b lim: 90 exec/s: 8 rss: 70Mb L: 43/54 MS: 1 EraseBytes-
00:08:51.654 #17 DONE cov: 10810 ft: 17811 corp: 11/446b lim: 90 exec/s: 8 rss: 70Mb
00:08:51.654 ###### Recommended dictionary. ######
00:08:51.654 "\234\264Px\015:\177\000" # Uses: 1
00:08:51.654 ###### End of recommended dictionary. ######
00:08:51.654 Done 17 runs in 2 second(s)
00:08:51.654 12:08:38 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6
00:08:51.654 12:08:38 -- ../common.sh@72 -- # (( i++ ))
00:08:51.654 12:08:38 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:51.654 12:08:38 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT
00:08:51.654
00:08:51.654 real 0m19.234s
00:08:51.654 user 0m27.238s
00:08:51.654 sys 0m1.720s
00:08:51.654 12:08:38 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:51.654 12:08:38 -- common/autotest_common.sh@10 -- # set +x
00:08:51.654 ************************************
00:08:51.654 END TEST vfio_fuzz
00:08:51.654 ************************************
00:08:51.913 12:08:38 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]]
00:08:51.913
00:08:51.913 real 1m23.484s
00:08:51.913 user 2m6.497s
00:08:51.913 sys 0m10.424s
00:08:51.913 12:08:38 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:51.913 12:08:38 -- common/autotest_common.sh@10 -- # set +x
00:08:51.913 ************************************
00:08:51.913 END TEST llvm_fuzz
00:08:51.913 ************************************
00:08:51.913 12:08:38 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]]
00:08:51.913 12:08:38 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT
00:08:51.913 12:08:38 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup
00:08:51.913 12:08:38 -- common/autotest_common.sh@712 -- # xtrace_disable
00:08:51.913 12:08:38 -- common/autotest_common.sh@10 -- # set +x
00:08:51.913 12:08:38 -- spdk/autotest.sh@386 -- # autotest_cleanup
00:08:51.913 12:08:38 -- common/autotest_common.sh@1371 -- # local autotest_es=0
00:08:51.913 12:08:38 -- common/autotest_common.sh@1372 -- # xtrace_disable
00:08:51.913 12:08:38 -- common/autotest_common.sh@10 -- # set +x
00:08:58.497 INFO: APP EXITING
00:08:58.497 INFO: killing all VMs
00:08:58.497 INFO: killing vhost app
00:08:58.497 INFO: EXIT DONE
00:09:00.399 Waiting for block devices as requested
00:09:00.399 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma
00:09:00.399 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma
00:09:00.399 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma
00:09:00.399 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:09:00.399 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:09:00.658 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:00.658 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:00.658 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:00.917 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:09:00.917 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:09:00.917 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:09:01.175 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:09:01.175 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:09:01.175 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:01.433 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:01.433 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:01.433 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme
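The "vfio-pci -> ioatdma" and "vfio-pci -> nvme" lines record each PCI function being handed back from the userspace vfio-pci driver to its in-kernel driver during teardown, which the test scripts drive. For reference, a minimal manual equivalent of one such rebind via the standard sysfs interface might look like this; the BDF and target driver are taken from the last line above:

    bdf=0000:d8:00.0                                                   # example device from the log
    echo "$bdf" | sudo tee /sys/bus/pci/devices/$bdf/driver/unbind     # detach vfio-pci
    echo nvme   | sudo tee /sys/bus/pci/devices/$bdf/driver_override   # pin the kernel driver
    echo "$bdf" | sudo tee /sys/bus/pci/drivers_probe                  # re-probe the device
    echo        | sudo tee /sys/bus/pci/devices/$bdf/driver_override   # clear the override again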
00:09:04.717 Cleaning
00:09:04.717 Removing: /dev/shm/spdk_tgt_trace.pid1121311
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1118837
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1120108
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1121311
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1122056
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1122348
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1122728
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1123061
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1123357
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1123488
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1123711
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1124022
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1124880
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1128015
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1128370
00:09:04.717 Removing: /var/run/dpdk/spdk_pid1128667
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1128695
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1129397
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1129657
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1130544
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1130677
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1130969
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1131137
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1131285
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1131550
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1131924
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1132206
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1132496
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1132593
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1132870
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1132987
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1133205
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1133421
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1133604
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1133773
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1134066
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1134333
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1134614
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1134883
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1135092
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1135232
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1135471
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1135749
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1136031
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1136297
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1136548
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1136704
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1136898
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1137158
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1137452
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1137718
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1138002
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1138167
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1138350
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1138580
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1138865
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1139134
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1139421
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1139638
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1139819
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1139995
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1140278
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1140548
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1140832
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1141107
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1141312
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1141463
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1141707
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1141977
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1142259
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1142527
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1142819
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1142880
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1143210
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1143676
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1144223
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1144611
00:09:04.976 Removing: /var/run/dpdk/spdk_pid1145060
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1145597
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1146012
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1146435
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1146976
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1147415
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1147814
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1148351
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1148831
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1149188
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1149736
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1150190
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1150568
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1151105
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1151550
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1151943
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1152486
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1152894
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1153320
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1153862
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1154183
00:09:05.234 Removing: /var/run/dpdk/spdk_pid1154691
00:09:05.235 Removing: /var/run/dpdk/spdk_pid1155295
00:09:05.235 Removing: /var/run/dpdk/spdk_pid1155729
00:09:05.235 Removing: /var/run/dpdk/spdk_pid1156138
00:09:05.235 Removing: /var/run/dpdk/spdk_pid1156680
00:09:05.235 Removing: /var/run/dpdk/spdk_pid1157223
00:09:05.235 Removing: /var/run/dpdk/spdk_pid1157658
00:09:05.235 Removing: /var/run/dpdk/spdk_pid1158068
00:09:05.235 Clean
00:09:09.427 killing process with pid 1074627
00:09:09.427 killing process with pid 1074624
00:09:09.427 killing process with pid 1074626
00:09:09.427 killing process with pid 1074625
00:09:09.427 12:08:55 -- common/autotest_common.sh@1436 -- # return 0
00:09:09.427 12:08:55 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup
00:09:09.427 12:08:55 -- common/autotest_common.sh@718 -- # xtrace_disable
00:09:09.427 12:08:55 -- common/autotest_common.sh@10 -- # set +x
00:09:09.427 12:08:55 -- spdk/autotest.sh@389 -- # timing_exit autotest
00:09:09.427 12:08:55 -- common/autotest_common.sh@718 -- # xtrace_disable
00:09:09.427 12:08:55 -- common/autotest_common.sh@10 -- # set +x
00:09:09.427 12:08:55 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:09.427 12:08:55 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:09.427 12:08:55 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
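The timing_enter/timing_exit pairs traced here append per-step durations to the timing.txt made world-readable above; the epilogue below feeds that file to FlameGraph to draw a build-timing chart. Run standalone it would look roughly like this (the command and paths appear verbatim in the trace below; only the SVG redirect is added, since flamegraph.pl writes to stdout):

    /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: \
        --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt \
        > build_timing.svg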
00:09:09.427 12:08:55 -- spdk/autotest.sh@394 -- # hash lcov
00:09:09.427 12:08:55 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]]
00:09:09.427 12:08:56 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:09:09.427 12:08:56 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:09:09.427 12:08:56 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:09:09.427 12:08:56 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:09:09.427 12:08:56 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:09.427 12:08:56 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:09.428 12:08:56 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:09.428 12:08:56 -- paths/export.sh@5 -- $ export PATH
00:09:09.428 12:08:56 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:09.428 12:08:56 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:09:09.428 12:08:56 -- common/autobuild_common.sh@440 -- $ date +%s
00:09:09.428 12:08:56 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1730545736.XXXXXX
00:09:09.428 12:08:56 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1730545736.9ha1CK
00:09:09.428 12:08:56 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]]
00:09:09.428 12:08:56 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']'
00:09:09.428 12:08:56 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:09:09.428 12:08:56 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:09:09.428 12:08:56 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:09:09.428 12:08:56 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
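The autobuild trace above derives a unique packaging workspace from the epoch time: date +%s feeds the mktemp template, producing /tmp/spdk_1730545736.9ha1CK on this run. The same two-step pattern in isolation (variable names are illustrative):

    stamp=$(date +%s)                               # seconds since the epoch, e.g. 1730545736
    ws=$(mktemp -dt "spdk_${stamp}.XXXXXX")         # -d: make a directory, -t: under $TMPDIR (default /tmp)
    echo "$ws"                                      # e.g. /tmp/spdk_1730545736.9ha1CK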
00:09:09.428 12:08:56 -- common/autobuild_common.sh@456 -- $ get_config_params
00:09:09.428 12:08:56 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:09:09.428 12:08:56 -- common/autotest_common.sh@10 -- $ set +x
00:09:09.428 12:08:56 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:09:09.428 12:08:56 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112
00:09:09.428 12:08:56 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:09.428 12:08:56 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:09:09.428 12:08:56 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:09:09.428 12:08:56 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:09:09.428 12:08:56 -- spdk/autopackage.sh@19 -- $ timing_finish
00:09:09.428 12:08:56 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:09:09.428 12:08:56 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:09:09.428 12:08:56 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:09.428 12:08:56 -- spdk/autopackage.sh@20 -- $ exit 0
00:09:09.428 + [[ -n 1018761 ]]
00:09:09.428 + sudo kill 1018761
00:09:09.437 [Pipeline] }
00:09:09.452 [Pipeline] // stage
00:09:09.456 [Pipeline] }
00:09:09.468 [Pipeline] // timeout
00:09:09.473 [Pipeline] }
00:09:09.484 [Pipeline] // catchError
00:09:09.489 [Pipeline] }
00:09:09.499 [Pipeline] // wrap
00:09:09.505 [Pipeline] }
00:09:09.523 [Pipeline] // catchError
00:09:09.536 [Pipeline] stage
00:09:09.538 [Pipeline] { (Epilogue)
00:09:09.552 [Pipeline] catchError
00:09:09.554 [Pipeline] {
00:09:09.566 [Pipeline] echo
00:09:09.571 Cleanup processes
00:09:09.577 [Pipeline] sh
00:09:09.859 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:09.859 1166878 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:09.871 [Pipeline] sh
00:09:10.155 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:10.155 ++ grep -v 'sudo pgrep'
00:09:10.155 ++ awk '{print $1}'
00:09:10.155 + sudo kill -9
00:09:10.155 + true
00:09:10.166 [Pipeline] sh
00:09:10.450 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:09:10.450 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:10.450 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:11.400 [Pipeline] sh
00:09:11.685 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:09:11.685 Artifacts sizes are good
00:09:11.699 [Pipeline] archiveArtifacts
00:09:11.707 Archiving artifacts
00:09:11.753 [Pipeline] sh
00:09:12.039 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:09:12.054 [Pipeline] cleanWs
00:09:12.064 [WS-CLEANUP] Deleting project workspace...
00:09:12.064 [WS-CLEANUP] Deferred wipeout is used...
00:09:12.071 [WS-CLEANUP] done
00:09:12.073 [Pipeline] }
00:09:12.090 [Pipeline] // catchError
00:09:12.101 [Pipeline] sh
00:09:12.429 + logger -p user.info -t JENKINS-CI
00:09:12.438 [Pipeline] }
00:09:12.451 [Pipeline] // stage
00:09:12.456 [Pipeline] }
00:09:12.470 [Pipeline] // node
00:09:12.475 [Pipeline] End of Pipeline
00:09:12.525 Finished: SUCCESS